Open Data Cam Combines Camera, GPU, And Neural Network In An Artisanal DIY Cereal Box

The engineers and product designers at [moovel lab] have created the Open Data Cam – an AI camera platform that can identify and count objects as they move through its field of view – along with an open source guide for making your own.

Step one: get out your ruler and utility knife. In this world of ubiquitous 3D printers, they’ve taken a decidedly low-tech approach to the project’s enclosure: a cut, folded, and zip-tied plastic box, with a cardboard frame inside to hold the electronic bits. It’s “splash proof” and certainly cheap to make, but we’re a little worried about cooling and physical protection for the electronics inside, as they’re neither cheap nor rugged components.

So what’s inside? An Nvidia Jetson TX2 board, a LiPo battery with some charging circuitry, and a standard webcam. The special sauce, however, is the software, which is available on GitHub. [Moovel lab]’s engineers have put together a nice-looking, WiFi-accessible mobile UI for marking the areas where you’d like the software to identify and tally objects. The actual object detection and identification is handled by the speedy YOLO neural network, a task the Nvidia board’s GPU is of course well suited for.
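
The counting itself is mostly simple geometry once the neural network and tracker have done their work: a detection is tallied when its tracked position crosses a line or area marked in that UI. One simple way to implement it is a counting line, and here’s our own toy TypeScript version of the idea (a sketch of the principle, not the project’s actual code):

```typescript
// Toy counting logic: tally an object when its tracked path between two frames
// crosses a user-drawn counting line. Our own illustration of the principle,
// not code from the Open Data Cam repository.

type Point = { x: number; y: number };

// 2D cross product of (a - o) and (b - o); its sign says which side of line o->a point b is on.
function cross(o: Point, a: Point, b: Point): number {
  return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
}

// True if segment p1->p2 strictly crosses segment q1->q2 (the counting line).
function segmentsIntersect(p1: Point, p2: Point, q1: Point, q2: Point): boolean {
  const d1 = cross(q1, q2, p1);
  const d2 = cross(q1, q2, p2);
  const d3 = cross(p1, p2, q1);
  const d4 = cross(p1, p2, q2);
  return (d1 > 0) !== (d2 > 0) && (d3 > 0) !== (d4 > 0);
}

// A tracked object's position in two consecutive frames, plus a counting line.
const previous: Point = { x: 120, y: 300 };
const current: Point = { x: 180, y: 340 };
const lineStart: Point = { x: 0, y: 320 };
const lineEnd: Point = { x: 640, y: 320 };

let carCount = 0;
if (segmentsIntersect(previous, current, lineStart, lineEnd)) {
  carCount++; // the object moved across the counting line this frame
}
console.log(carCount); // 1
```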

As the Open Data Cam’s unblinking glass eye gazes upon our urban environments, it will log its observations in an ancient and mysterious language: CSV. It’s up to you, human, to interpret this information and use it for good.
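
Those CSV logs are easy enough to interpret with a few lines of scripting. Here’s a minimal sketch that tallies counts per object class; the file name and column names below are our assumptions for illustration, not the project’s documented export format:

```typescript
// Minimal sketch of tallying an Open Data Cam style CSV export per object class.
// The file name and the "class" column are assumptions for illustration,
// not the project's documented schema.
import { readFileSync } from "fs";

const rows = readFileSync("open-data-cam-export.csv", "utf8")
  .trim()
  .split("\n")
  .map((line) => line.split(","));

const [header, ...records] = rows;
const classColumn = header.indexOf("class"); // which column names the detected object

const tally = new Map<string, number>();
for (const record of records) {
  const objectClass = record[classColumn];
  tally.set(objectClass, (tally.get(objectClass) ?? 0) + 1);
}

for (const [objectClass, count] of tally) {
  console.log(`${objectClass}: ${count}`); // e.g. "car: 1423", "bicycle: 87"
}
```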

A summary video and build time lapse are embedded after the break.

Continue reading “Open Data Cam Combines Camera, GPU, And Neural Network In An Artisanal DIY Cereal Box”

Stretched PC Case Turned GPU Cryptominer

We don’t do financial planning here at Hackaday, so we won’t weigh in on the viability of making money mining cryptocurrency in such a volatile market. But we will say that if you’re going to build a machine to hammer away at generating Magical Internet Monies, you might as well make it cool. Even if you don’t turn a profit, at least you’ll have something interesting to look at while you weep over your electricity bill.

Sick of seeing the desktop machine he built a decade ago gathering dust, [plaggle24w5] decided to use it as the base for a cryptocurrency mining rig. Of course, none of the original internals would do him any good, but the case itself ended up being a useful base to expand on. With the addition of some 3D printed components, he stretched out the case and installed an array of video cards.

To start with, all the original plastic was ripped off, leaving just the bare steel case. He then jammed a second power supply into the original optical drive bays to provide the extra power those thirsty GPUs would soon be sucking down, and designed some 3D printed arms to push the side panel out far enough that the video cards could be mounted vertically alongside the case. Finally, three case fans were added to the bottom to blow air through the cards.

While [plaggle24w5] mentions this arrangement does work with the case standing up, there’s obviously not a lot of air getting to the fans on the bottom when they’re only an inch or so off the ground. Turning the case on its side, with the motherboard parallel to the floor, allows for much better airflow and results in a measurable dip in operating temperature. Just hope you never drop anything down onto the exposed motherboard…

Mining Bitcoin on desktop computers might be a distant memory, but the latest crop of cryptocurrencies are (for now) giving new players a chance to relive those heady early days.

Ethereum: GPU Mining Is Back But For How Long?

By now, everyone and their dog has at least heard of Bitcoin. While no government will accept tax payments in Bitcoin just yet, it’s ridiculously close to being real money. We’ve even paid for pizza delivery in Bitcoin. But it’s not the only cryptocurrency in town.

Ethereum, initially launched in 2015, is an open-source platform that has been making headway among the 900 or so Bitcoin clones, and it is now the number two cryptocurrency in the world, with only Bitcoin beating it in value. This year alone, Ether has risen in value by around 4000%, and at the time of writing is worth $375 per coin. And while the Bitcoin world is dominated by professional, purpose-built mining rigs, there is still room in the Ethereum ecosystem for the little guy or gal.

Ethereum is for Hackers

There may be many factors behind Ethereum’s popularity, but one reason is that the algorithm is designed to be resistant to ASIC mining. Unlike Bitcoin, anyone with a half-decent graphics card or gaming rig can mine Ether, giving them the chance to earn some digital currency. This is largely because mining Ethereum coins requires lots of high-speed memory, which ASICs lack. The algorithm also has built-in ASIC detection and will refuse to mine properly on them.
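
To get a feel for why memory is the bottleneck, here’s a toy memory-hard proof-of-work loosely in the spirit of Ethash. This is our own drastically simplified sketch, not the real algorithm: each candidate nonce forces a string of unpredictable reads into a large dataset, so memory bandwidth rather than raw arithmetic sets the pace.

```typescript
// Toy memory-hard proof-of-work, loosely in the spirit of Ethash.
// A drastic simplification for illustration only, not the real algorithm.
import { createHash } from "crypto";

const DATASET_WORDS = 1 << 20; // 4 MB of 32-bit words; the real Ethash DAG is gigabytes
const dataset = new Uint32Array(DATASET_WORDS);

// Fill the dataset deterministically from a seed (a stand-in for DAG generation).
let state = createHash("sha256").update("epoch-seed").digest();
for (let i = 0; i < DATASET_WORDS; i += 8) {
  state = createHash("sha256").update(state).digest();
  for (let j = 0; j < 8; j++) {
    dataset[i + j] = state.readUInt32LE(j * 4);
  }
}

// "Hashing" a header + nonce forces 64 unpredictable reads from the big dataset,
// so memory bandwidth, not raw arithmetic, sets the mining speed.
function toyHash(header: string, nonce: number): number {
  let mix = createHash("sha256").update(`${header}:${nonce}`).digest().readUInt32LE(0);
  for (let round = 0; round < 64; round++) {
    const index = ((mix ^ round) >>> 0) % DATASET_WORDS;
    mix = (Math.imul(mix, 0x01000193) ^ dataset[index]) >>> 0;
  }
  return mix;
}

// Mining loop: hunt for a nonce whose toy hash clears a difficulty target.
const target = 0x0000ffff;
for (let nonce = 0; nonce < 1_000_000; nonce++) {
  if (toyHash("block-header", nonce) < target) {
    console.log(`found nonce ${nonce}`);
    break;
  }
}
```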

Small-scale Bitcoin miners were stung when the mining technology jumped from GPUs to ASICs. ASIC-based miners simply outperformed the home gamer, and individuals suddenly discovered that their rigs were not worth much, since there was a stampede of people trying to sell off their high-end GPUs all at once. Some would go on to buy or build an ASIC, but the vast majority just stopped mining. They were out of the game: they couldn’t compete with ASICs and stay profitable, since mining in itself uses huge amounts of electricity.

Economies of scale like those in Bitcoin mining tend to favor a small number of very large players, which is in tension with the distributed nature of cryptocurrencies, which rely on consensus to validate transactions. It’s much easier to imagine a small number of large players colluding to manipulate the currency, for instance. Ethereum, on the other hand, hopes to keep its miners GPU-based to avoid huge mining farms and give the average Joe a chance at scoring big by discovering a coin on their own computer.

Ethereum Matters

Ethereum’s rise to popularity has basically undone Bitcoin’s move to ASICs, at least in the gamer and graphics card markets. Suddenly, used high-end graphics cards are worth something again. And there are effects in the new equipment market as well. For instance, AMD cards seem to outperform other cards at the moment, and AMD is taking advantage of this with the release of mining-specific GPU drivers for its new Vega architecture. Indeed, even though AMD bundled its hottest RX Vega 64 GPU with two games, a motherboard, and a CPU in an attempt to make the package more appealing to gamers than miners, the Radeon RX Vega 56 sold out in five minutes, with Ethereum miners being blamed.

Besides creating ripples in the market for high-end gaming computers, cryptocurrencies are probably going to be relevant in the broader economy, and Ethereum is number two for now. In a world where even banks are starting to take out patents on blockchain technology in an attempt to get in on the action, cryptocurrencies aren’t as much of a fringe pursuit as they were a few years ago. Ethereum’s ASIC resistance is perhaps its killer feature, preventing centralization of control and keeping the little hacker in the mining game. Only time will tell if it’s going to be a Bitcoin contender, but it’s certainly worth keeping your eye on.

Using The GPU From JavaScript

Everyone knows that writing programs that exploit the GPU (Graphics Processing Unit) in your computer’s video card requires special arcane tools, right? Well, thanks to [Matthew Saw], [Fazil Sapuan], and [Cheah Eugene], perhaps not. At a hackathon, they turned out a JavaScript library that allows you to create “kernel” functions to execute on the GPU of the target system. There’s a demo available with a benchmark which, on our machine, sped up a 512×512 calculation by well over five times. You can download the library from the same page. There’s also a GitHub page.

The documentation is a bit sparse but readable. You simply define the function you want to execute and the dimensions of the problem. You can specify one, two, or three dimensions, as suits your problem space. When you execute the associated function, it will try to run the kernels on your GPU in parallel. If it can’t, it will still get the right answer, just slowly.
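
In practice, defining a kernel looks something like the sketch below. We’re borrowing the API of the present-day gpu.js library as a stand-in, so the names and details may not match the hackathon library exactly:

```typescript
// Sketch of the kernel-definition pattern, borrowing the API of the present-day
// gpu.js library as a stand-in; the hackathon library may differ in the details.
import { GPU } from "gpu.js";

const gpu = new GPU();

// One kernel invocation per output element of a 512 x 512 matrix multiply,
// indexed by this.thread.x and this.thread.y.
const multiplyMatrices = gpu
  .createKernel(function (this: any, a: number[][], b: number[][]) {
    let sum = 0;
    for (let i = 0; i < 512; i++) {
      sum += a[this.thread.y][i] * b[i][this.thread.x];
    }
    return sum;
  })
  .setOutput([512, 512]); // the problem dimensions: 2D, 512 x 512

const a = Array.from({ length: 512 }, () => Array.from({ length: 512 }, Math.random));
const b = Array.from({ length: 512 }, () => Array.from({ length: 512 }, Math.random));
const c = multiplyMatrices(a, b); // runs on the GPU when it can, falls back to CPU otherwise
```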

Continue reading “Using The GPU From JavaScript”

Microchip’s PIC32MZ DA — The Microcontroller With A GPU

When it comes to displays, there is a gap between a traditional microcontroller and a Linux system-on-a-chip (SoC). The SoC that lives in a smartphone will always have enough RAM for a framebuffer and usually has a few pins dedicated to an LCD interface. Today, Microchip has announced a microcontroller that blurs the lines between what can be done with an SoC and what can be done with a microcontroller. The PIC32MZ ‘DA’ family of microcontrollers is designed for graphics applications and comes with a boatload of RAM and a dedicated GPU.

The key feature of this chip is that boatload of RAM: the PIC32MZ DA family includes packages with 32 MB of integrated DRAM designed to be used as a framebuffer, with support for 24-bit color on SXGA (1280 x 1024) panels. There’s also a 2D GPU in there, with support for sprites, blitting, alpha blending, line drawing, and filling rectangles. No, it can’t play Crysis — just to get that meme out of the way — but it is an excellent platform for GUIs.
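
If you’re wondering how far 32 MB goes, a quick back-of-the-envelope calculation (our arithmetic, not Microchip’s numbers) shows that even a double-buffered SXGA framebuffer at 24 bits per pixel uses well under a quarter of it:

```typescript
// Back-of-the-envelope framebuffer math for the PIC32MZ DA's 32 MB of DRAM.
// Our own arithmetic, just to put the numbers in perspective.
const width = 1280;      // SXGA
const height = 1024;
const bytesPerPixel = 3; // 24-bit color
const MiB = 1024 * 1024;

const frameBytes = width * height * bytesPerPixel; // 3,932,160 bytes
console.log(`one frame:       ${(frameBytes / MiB).toFixed(2)} MiB`);       // 3.75 MiB
console.log(`double-buffered: ${((2 * frameBytes) / MiB).toFixed(2)} MiB`); // 7.50 MiB
// Even double-buffered, an SXGA framebuffer at 24 bpp leaves roughly 24 MB
// free for the sprites and UI assets the 2D GPU blits around.
```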

Continue reading “Microchip’s PIC32MZ DA — The Microcontroller With A GPU”

Peering Inside The GPU Black Box

Researchers at Binghamton University have built their own graphics processing unit (GPU) that can be flashed onto an FPGA. While “graphics” is in the name, this GPU design aims to provide a general-purpose computing peripheral: a GPGPU testbed. Of course, that doesn’t mean that you can’t play Quake (slowly) on it.

The Binghamton crew’s design is not only open, but easily modifiable. It’s a GPGPU where you not only know what’s going on inside the silicon, but also have open-source drivers and interfaces. As Prof. [Timothy Miller] says,

 It was bad for the open-source community that GPU manufacturers had all decided to keep their chip specifications secret. That prevented open source developers from writing software that could utilize that hardware. With contributions from the ‘open hardware’ community, we can incorporate more creative ideas and produce an increasingly better tool.

That’s where you come in. [Jeff Bush], a member of the team, has a great blog with a detailed walk-through of a known GPU design. All of the Verilog and C++ code is up on [Jeff]’s GitHub, including documentation.

If you’re interested in the deep magic that goes on inside GPUs, here’s a great way to peek inside the black box.

Shmoocon 2016: GPUs And FPGAs To Better Detect Malware

One of the big problems in detecting malware is that there are so many different forms of the same malicious code. This problem of polymorphism is what led [Rick Wesson] to develop icewater, a clustering technique that identifies malware.

Presented at Shmoocon 2016, the icewater project is a new way to process and filter the vast number of samples one finds on the Internet. Processing 300,000 new samples a day to determine if they contain polymorphic malware is a daunting task. The approach used here is to create a fingerprint from each binary sample using a space-filling curve. Polymorphism will change a lot of the bits in each sample, but as with human fingerprints, patterns are still present in these binary fingerprints that indicate the sample is a variation on a previously known object.
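
As a rough illustration of the space-filling-curve trick (our own sketch, not icewater’s code), here’s how a binary can be laid out along a Z-order curve so that bytes near each other in the file stay near each other in a 2D fingerprint:

```typescript
// Rough sketch of the space-filling-curve idea, not icewater's actual code:
// lay a binary out along a Z-order (Morton) curve so bytes that are close in
// the file land close together in a 2D "fingerprint" image. Similar samples
// then yield similar images even after polymorphic bit-twiddling.
import { readFileSync } from "fs";

// Compact every other bit of a 32-bit value into the low 16 bits.
function compactBits(n: number): number {
  n &= 0x55555555;
  n = (n | (n >> 1)) & 0x33333333;
  n = (n | (n >> 2)) & 0x0f0f0f0f;
  n = (n | (n >> 4)) & 0x00ff00ff;
  n = (n | (n >> 8)) & 0x0000ffff;
  return n;
}

// Map a linear byte offset to (x, y) on the Z-order curve.
function mortonDecode(i: number): [number, number] {
  return [compactBits(i), compactBits(i >> 1)];
}

const SIZE = 256; // 256 x 256 grayscale fingerprint
const image = new Uint8Array(SIZE * SIZE);

const sample = readFileSync("sample.bin"); // hypothetical malware sample
for (let i = 0; i < Math.min(sample.length, SIZE * SIZE); i++) {
  const [x, y] = mortonDecode(i);
  image[y * SIZE + x] = sample[i];
}
// `image` can now be downsampled or hashed and compared against known clusters.
```
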
Continue reading “Shmoocon 2016: GPUs And FPGAs To Better Detect Malware”