Open Data Cam Combines Camera, GPU, And Neural Network In An Artisanal DIY Cereal Box

The engineers and product designers at [moovel lab] have created the Open Data Cam – an AI camera platform that can identify and count objects as they move through its field of view – along with an open source guide for making your own.

Step one: get out your ruler and utility knife. In this world of ubiquitous 3D printers, they’ve taken a decidedly low-tech approach to the project’s enclosure: a cut, folded, and zip-tied plastic box, with a cardboard frame inside to hold the electronic bits. It’s “splash proof” and certainly cheap to make, but we’re a little worried about cooling and physical protection for the electronics inside, as they’re not exactly cheap or rugged components.

So what’s inside? An Nvidia Jetson TX2 board, a LiPo battery with some charging circuitry, and a standard webcam. The special sauce, however, is the software, which is available on GitHub. [moovel lab]’s engineers have put together a nice-looking, WiFi-accessible mobile UI for marking the areas where you’d like the software to identify and tally objects. The actual object detection and identification tasks are performed by the speedy YOLO neural network, a task the Nvidia board’s GPU is of course well suited for.
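As a rough idea of what the counting side involves (this is our own simplified Python sketch, not the actual Open Data Cam code), imagine the detector handing back labeled positions for each frame; all the app then has to do is check which user-drawn area each detection falls into, tally per label, and append the result to a CSV log:

```python
import csv
from collections import Counter

def inside(area, x, y):
    """Axis-aligned box test; area is (x1, y1, x2, y2) in pixels."""
    x1, y1, x2, y2 = area
    return x1 <= x <= x2 and y1 <= y <= y2

def tally_frame(detections, areas):
    """Count detections per label for each counting area.

    detections: (label, x, y) tuples -- a stand-in for YOLO output.
    areas: dict mapping an area name to its (x1, y1, x2, y2) box.
    """
    counts = {name: Counter() for name in areas}
    for label, x, y in detections:
        for name, box in areas.items():
            if inside(box, x, y):
                counts[name][label] += 1
    return counts

# One frame's worth of made-up detections, logged Open-Data-Cam style.
areas = {"crosswalk": (100, 200, 400, 350)}
detections = [("car", 150, 250), ("bicycle", 390, 300), ("car", 600, 100)]

with open("observations.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for name, counter in tally_frame(detections, areas).items():
        for label, count in counter.items():
            writer.writerow([name, label, count])
```

The real software has to do considerably more than this, of course, since counting objects “as they move through its field of view” means tracking them between frames rather than re-counting them on every frame.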

As the Open Data Cam’s unblinking glass eye gazes upon our urban environments, it will log its observations in an ancient and mysterious language: CSV. It’s up to you, human, to interpret this information and use it for good.

A summary video and build time lapse are embedded after the break.

Continue reading “Open Data Cam Combines Camera, GPU, And Neural Network In An Artisanal DIY Cereal Box”

Retro Rebuild Recreates SGI Workstation Demos On The Go

When [Lawrence] showed us the Alice4 after Maker Faire Bay Area last weekend, it wasn’t apparent how special the system was. The case is clean and white, adorned only with a big red button below a 7″ screen, with a power switch around the back. When the switch is flicked, the system boots to display a familiar animation and drops you at a menu. Poking around from here elicits a variety of self-contained graphics demos, some interactive. So this is a Raspberry Pi in a box playing videos, right? Not even close.

Often, retro computing focuses on personal computer systems. When they were new, their 8-bit graphics or intricate 2D sprites were state of the art; now their appeal tends towards learning opportunities and the thrill of nostalgia. That may still be true of Alice4, the system [Brad, Lawrence, Mike, and Chris] put together to run Silicon Graphics (SGI) demos from the mid-1980s, but it’s not the whole story. [Lawrence] and [Brad] had both worked at SGI during its heyday and had fond memories of the graphics demos that shipped with those mammoth workstations. So they built Alice4 from the FPGA up to run those very same demos in real time.

Thanks to Moore’s law, today’s embedded systems put yesterday’s powerhouses within reach. [Lawrence] and [Brad] found the old demo code on a ratty FTP server and tailor-made Alice4’s software and hardware to run it natively. [Brad] wrote a libgl which implements the subset of the IrisGL API needed to support their selected set of demos. The libgl emits sets of triangles to the SDRAM, where [Lawrence]’s HDL running on the onboard FPGA fetches them, interpolates color and depth, and draws the result on-screen. Together they allow the $99 Altera Cyclone V development board at Alice4’s heart to run these state-of-the-art demos in the palm of your hand.
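If you’re wondering what “interpolate color and depth” means in practice, here’s a tiny Python sketch of the per-pixel math: blending a triangle’s vertex colors and depths using barycentric weights. This is only an illustration of the idea, not Alice4’s HDL or its libgl.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w0 = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    w1 = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    return w0, w1, 1.0 - w0 - w1

def shade_pixel(p, verts, colors, depths):
    """Interpolate color (RGB tuples) and depth across a triangle at pixel p."""
    w = barycentric(p, *verts)
    color = tuple(sum(wi * ci for wi, ci in zip(w, channel))
                  for channel in zip(*colors))
    depth = sum(wi * di for wi, di in zip(w, depths))
    return color, depth

# A triangle with red, green, and blue corners, sampled at its centroid.
verts = [(0, 0), (10, 0), (0, 10)]
colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
depths = [0.2, 0.5, 0.9]
print(shade_pixel((10 / 3, 10 / 3), verts, colors, depths))
```

Alice4’s FPGA does this for every pixel inside each triangle it fetches from SDRAM, which is exactly the kind of repetitive arithmetic hardware is good at.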

Alice4 is open source and extensively documented. Peruse the archeology of reverse-engineering the graphics API, or the discussion of FIFO design in the FPGA. If those don’t sate your appetite, check out a video of Alice4 in action after the break.

Continue reading “Retro Rebuild Recreates SGI Workstation Demos On The Go”

High End PC Gets A Rustic Woodworking Piece Of Art For A Case

As [Matt] from [DIY Perks] was about to assemble a new PC, he decided to take a unique direction when it came to building a case. Despite the appearance of a woodworking piece with weird industrial radiators, there is actually a full-fledged, high-end PC hidden inside.

Those radiators are a pair of almost-the-biggest-you-can-buy heatsinks, one of which has been modified to fit the graphics card. Removing the graphics card’s stock cooling fan unit cut down significantly on noise and helped meet the stringent space requirements of the build. Those fans, however, also keep other components on the card cool, so [Matt] cut pieces of copper plate to affix to those areas and joined them to the heatsink with a heat pipe bent to shape. The elm wood case then began to take shape around the graphics card, cut into pieces to accommodate the heat pipes and sealed with black tack to dampen the ‘coil whine’ of the GPU; it turns out the likely culprits are the MOSFETs, but close enough.

Continue reading “High End PC Gets A Rustic Woodworking Piece Of Art For A Case”

Modder Puts Computer Inside A Power Supply

When building a custom computer rig, most people put the SMPS power supply inside the computer case. [James], a.k.a. [Aibohphobia], a.k.a. [fearofpalindromes], turned it inside out and built the STX160.0 – a full-fledged gaming computer stuffed inside an ATX power supply enclosure. While Small Form Factor (SFF) computers are nothing new, his build packs a powerful punch in a small enclosure and is a great example of computer modding, hacker ingenuity, and engineering. The finished computer uses a Mini-ITX form factor motherboard with an Intel i5 6500T quad-core 2.2GHz processor, an EVGA GTX 1060 SC graphics card, 16GB of DDR4 RAM, a 250GB SSD, a WiFi card, and two USB ports, all powered from a 160 W AC-DC converter. Its external dimensions are the same as an ATX-EPS power supply: 150 mm wide, 86 mm high, and 230 mm deep. The STX160.0 is powered directly from the mains rather than from an external brick, which [James] feels would have been cheating.

For those who would like a quick TL;DR pictorial review, head over to his photo album on Imgur first to feast on pictures of the completed computer and its innards. But the devil is in the details, so check out the forum thread for a ton of interesting build information, component sources, tricks, and trivia. For example, to connect the graphics card to the motherboard, he used an “M.2 to powered PCIe x4 adapter” coupled with a flexible cable extender from a quaint company called Adex Electronics, who still prefer to do business the old-fashioned way and whose website might remind you of the days when Netscape Navigator was the dominant browser.

As a benchmark, [James] posts that “with the cover panel on, at full load (Prime95 Blend @ 2 threads and FurMark 1080p 4x AA) the CPU is around 65°C with the CPU fan going at 1700RPM, and the GPU is at 64°C at 48% fan speed.” Fairly impressive for what could be passed off at first glance as a power supply.

There are two really interesting takeaways for us in this project. The first is his meticulous research to find specific parts that met his requirements from among the vast number of available choices. The second is his extremely detailed set of notes on designing the custom enclosure and making it DFM (design for manufacturing) friendly so it could be mass-produced – just take a look at his “Table of Contents” for a taste of the amount of ground he covers. If you are interested in custom builds and computer modding, there is a huge amount of useful information in there for you.

Thanks to [Arsenio Dev], who posted a link to this hilarious thread on Reddit discussing the STX160.0. Check out a full teardown and review of the STX160.0 by [Not for Concentrate] in the video after the break.

Continue reading “Modder Puts Computer Inside A Power Supply”

Smarter Phones In Your Hacks With TensorFlow Lite

One way to run a compute-intensive neural network on a hack has been to put a decent laptop onboard. But wouldn’t it be great if you could go smaller and cheaper by using a phone instead? If your neural network was written using Google’s TensorFlow framework, then you’ve had the option of using TensorFlow Mobile, but it doesn’t use any of the phone’s accelerated hardware, so it might not have been fast enough.

TensorFlow Lite architecture

Google has just released a new solution: the developer preview of TensorFlow Lite for iOS and Android, along with announced plans to support the Raspberry Pi 3. On Android, the bottom layer is the Android Neural Networks API, which makes use of the phone’s DSP, GPU, and/or any other specialized hardware to speed up computations. Failing that, it falls back on the CPU.

Currently, fewer operators are supported than with TensorFlow Mobile, but more will be added. (Most of what you do in TensorFlow is done through operators, or ops. See our introduction to TensorFlow article if you need a refresher on how TensorFlow works.) The Lite version is intended to be the successor to Mobile. As with Mobile, you’d only do inference on the device. That means you’d train the neural network elsewhere, perhaps on a GPU-rich desktop or a GPU farm over the network, and then make use of the trained network on your device.
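For a feel of what that on-device inference step looks like, here’s a minimal Python sketch using TensorFlow’s tf.lite.Interpreter bindings as they exist today; the model file name and the zero-filled input are placeholders for a network you’ve already trained and converted to the .tflite format.

```python
import numpy as np
import tensorflow as tf

# Load a model that was trained and converted to .tflite elsewhere
# ("model.tflite" is a placeholder path).
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input matching the model's expected shape and dtype;
# in a real application this would be a camera frame or sensor data.
shape = input_details[0]["shape"]
frame = np.zeros(shape, dtype=input_details[0]["dtype"])

interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()

# Read back the result, e.g. class scores for an image classifier.
scores = interpreter.get_tensor(output_details[0]["index"])
print(scores)
```

On an Android phone the equivalent calls go through the Java or C++ API, with the Neural Networks API handling the hardware acceleration underneath.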

What are we envisioning here? How about replacing the MacBook Pro on the self-driving RC cars we’ve talked about with a much smaller, lighter and less power-hungry Android phone? The phone even has a camera and an IMU built-in, though you’d need a way to talk to the rest of the hardware in lieu of GPIO.

You can try out TensorFlow Lite fairly easily by going to their GitHub and downloading a pre-built binary. We suspect that’s what was done to produce the first of the demonstration videos below.

Continue reading “Smarter Phones In Your Hacks With TensorFlow Lite”

Interactive Visual Programming With Vvvv

Did you ever feel the urge to turn the power of image processing and OCR into music? Maybe you wanted to use motion capture to illustrate the dynamic movement of a kung-fu master in stunning images like the one above? Both projects were created with the same software.

vvvv (pronounced ‘four vee’, ‘vee four’, and sometimes even ‘veeveeveevee’) calls itself ‘a multi purpose framework’, which is about as vague and correct as calling a computer ‘a device that performs calculations’. What can it do, and what does the framework look like? I’d like to show you.

Since its first release in 1998, the project has never officially left beta. This doesn’t mean the recent beta releases are unstable; it’s just that the people behind vvvv refrain from declaring their software ‘finished’. It also serves as an excuse for some quirks, such as requiring 7-Zip to unpack the binaries and a UI that takes some getting used to. vvvv requires DirectX and as such is limited to Windows.

With the bad stuff out of the way, let’s take a look at what vvvv can do. First, as implied by its close relationship with DirectX, it’s really good at producing graphics. An example of interactive video is embedded below the break. With its dataflow / visual programming approach, it also lends itself to rapid prototyping and live coding: modifications to a patch, as programs are called in this context, immediately affect the output.

The name ‘patch’ harkens back to the days of analog synthesizers, and working with vvvv indeed has some similarities with signal processing that will make the DSP nerds among you feel right at home.

Continue reading “Interactive Visual Programming With Vvvv”

Solving Mazes With Graphics Cards

What if we told you that you likely have more computers than you think? And we are not talking about things that are computers without looking like one, such as most modern cars or certain lightbulbs. We are talking about the powerful machine hiding in your desktop computer called the ‘graphics card’. In the ordinary gaming rig, a graphics card that is far more powerful than the machine it’s built into is a common occurrence. In his tutorial, [Viktor Chlumský] demonstrates how to harness your GPU’s power to solve a maze.

Software that runs on a GPU is called a shader. In this example, a shader is shown that finds its way through a maze. We also catch a glimpse of the limitations that make this field of software special: [Viktor]’s solution has to work with only four variables, because all of its information is stored in the red, green, blue, and alpha channels of an image. The alpha channel represents the boundaries of the maze. The red and green channels are used to broadcast waves from the beginning and end points of the maze. Where the two waves meet marks the shortest solution, which is captured in the blue channel.
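As a plain-CPU illustration of that two-wave trick (not [Viktor]’s actual shader code), the Python sketch below grows one breadth-first wavefront from the start and one from the end across the open cells, then reports where they meet; that cell lies on a shortest path through the maze.

```python
from collections import deque

def bfs(maze, src):
    """Distance of every reachable open cell (0) from src, 4-connected."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        x, y = queue.popleft()
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if (0 <= ny < len(maze) and 0 <= nx < len(maze[0])
                    and maze[ny][nx] == 0 and (nx, ny) not in dist):
                dist[(nx, ny)] = dist[(x, y)] + 1
                queue.append((nx, ny))
    return dist

def meeting_point(maze, start, end):
    """Cell where the start and end wavefronts meet on a shortest path."""
    d_start, d_end = bfs(maze, start), bfs(maze, end)
    reachable = d_start.keys() & d_end.keys()
    return min(reachable, key=lambda c: d_start[c] + d_end[c])

# 0 = open cell, 1 = wall; coordinates are (x, y).
maze = [
    [0, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
print(meeting_point(maze, start=(0, 0), end=(3, 3)))
```

In the shader version the image itself is the only storage, so each wave presumably advances by one step per rendering pass rather than in a loop like this.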

Despite the GPU’s tons of cores and large memory, programming shaders feels a lot like working on microcontrollers. See for yourself in the maze-solving walkthrough below.

Continue reading “Solving Mazes With Graphics Cards”