Solving Mazes With Graphics Cards

What if we told you that you are likely to have more computers than you think? And we are not talking about things that are computers without looking like one, like most modern cars or certain lightbulbs. We are talking about the powerful machine hiding in your desktop computer: the graphics card. In the ordinary gaming rig, a graphics card that is far more powerful than the machine it’s built into is a common sight. In his tutorial, [Viktor Chlumský] demonstrates how to harness your GPU’s power to solve a maze.

Software that runs on a GPU is called a shader. In this example, [Viktor] shows a shader that finds its way through a maze. We also catch a glimpse of the limitations that make this field of software special: the solution has to work with only four variables, because all of its information is stored in the red, green, blue and alpha channels of an image. The alpha channel represents the boundaries of the maze. The red and green channels are used to broadcast waves from the start and end points of the maze. Where the two waves meet marks the shortest path, which is captured in the blue channel.
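To make the wave idea concrete, here is a rough CPU-side sketch in Python of what those channels end up encoding. It is not [Viktor]’s shader code (his version advances the waves one step per rendered frame and stores everything in a texture); the maze layout and the start and end positions below are made up purely for illustration.

```python
from collections import deque
import numpy as np

# A rough CPU-side sketch of the wave idea, not [Viktor]'s shader. The maze
# layout and the start/end positions are made up for illustration.
maze = np.array([          # 0 = corridor, 1 = wall (the alpha channel)
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
])

def flood(start):
    """Breadth-first wave: record the 'frame' on which each cell is reached."""
    dist = np.full(maze.shape, -1)
    dist[start] = 0
    queue = deque([start])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < maze.shape[0] and 0 <= nx < maze.shape[1]
                    and maze[ny, nx] == 0 and dist[ny, nx] < 0):
                dist[ny, nx] = dist[y, x] + 1
                queue.append((ny, nx))
    return dist

red = flood((0, 0))        # wave spreading from the entrance
green = flood((4, 4))      # wave spreading from the exit

# The 'blue' solution: cells reached by both waves whose combined travel time
# equals the shortest entrance-to-exit distance lie on the shortest path.
blue = (red >= 0) & (green >= 0) & (red + green == red[4, 4])
print(blue.astype(int))
```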

Despite the GPU’s tons of cores and ample memory, programming shaders feels a lot like working on microcontrollers. See for yourself in the maze-solving walkthrough below.

Continue reading “Solving Mazes With Graphics Cards”

Cryptocurrency Mining Post-Bitcoin

While the age of using your own computer to mine Bitcoin with spare CPU cycles has long passed, average folks aren’t entirely shut out of the cryptocurrency game yet. Luckily, Bitcoin isn’t the only game in town anymore, and with GPUs coming down in price it’s possible to build a mining rig for other currencies like Ethereum.

[Chris]’s build starts with some extruded aluminum and a handful of GPUs. He wanted to build something that didn’t take up too much space in his small apartment. Once the main computer was installed, each GPU was mounted upright in the rack, with each set getting its own dedicated fan. After installing a fan controller and some plexiglass, the rig was up and running, although [Chris] did have to finagle the software a little bit to get all of the GPUs to work properly.

While this build did use some tools that might only be available at a makerspace, like a mill and a 3D printer, the hardware is still within reason for someone with a little cash burning a hole in their pocket. And, if Ethereum keeps going up in value like it has been since the summer, it might pay for itself eventually, provided that your electric utility doesn’t charge too much for power.

And if you missed it, we just ran a feature on Ethereum. Check it out.

Neural Nets In The Browser: Why Not?

We keep seeing more and more TensorFlow neural network projects. We also keep seeing more and more things running in the browser. You don’t have to be Mr. Spock to see this one coming. TensorFire runs neural networks in the browser and claims that WebGL allows it to run as quickly as it would on the user’s desktop computer. The main page is a demo that stylizes images, but if you want more detail you’ll probably want to visit the project page instead. You might also enjoy the video from one of the creators, [Kevin Kwok], below.

TensorFire has two parts: a low-level language for writing massively parallel WebGL shaders that operate on 4D tensors, and a high-level library for importing models from Keras or TensorFlow. The authors claim it will work on any GPU and, in some cases, will actually be faster than running native TensorFlow.
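TensorFire’s browser-side import API is JavaScript and we won’t reproduce it here, but the Python side of that workflow is nothing exotic: the starting point is just an ordinary saved Keras model, along the lines of this generic, purely illustrative sketch.

```python
# A generic Keras example, purely for illustration -- not TensorFire code.
# A browser-side importer like TensorFire's would take the saved weights and
# repack them into WebGL textures for its shaders to chew on.
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... model.fit(...) on your training data ...
model.save("model.h5")    # the saved model is the hand-off point to the browser tooling
```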

Continue reading “Neural Nets In The Browser: Why Not?”

Using The GPU From JavaScript

Everyone knows that writing programs that exploit the GPU (Graphics Processing Unit) in your computer’s video card requires special arcane tools, right? Well, thanks to [Matthew Saw], [Fazil Sapuan], and [Cheah Eugene], perhaps not. At a hackathon, they turned out a JavaScript library that allows you to create “kernel” functions to execute on the GPU of the target system. There’s a demo available with a benchmark which, on our machine, sped up a 512×512 calculation by well over five times. You can download the library from the same page. There’s also a GitHub page.

The documentation is a bit sparse but readable. You simply define the function you want to execute and the dimensions of the problem. You can specify one, two, or three dimensions, as suits your problem space. When you execute the associated function it will try to run the kernels on your GPU in parallel. If it can’t, it will still get the right answer, just slowly.
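The library itself is JavaScript, so we won’t put words in its mouth here. But as a language-neutral sketch of the kernel idea (and of what the CPU fallback is stuck doing), here is the same sort of element-per-coordinate calculation written serially in Python; the matrix multiply and the smaller grid size are our own stand-ins for the demo’s 512×512 benchmark.

```python
# A language-neutral sketch of the 'kernel' idea, not the library's API.
# A kernel computes one output element from its (x, y) coordinates; the GPU
# runs one instance per coordinate in parallel, while a CPU fallback has to
# loop over every coordinate one at a time, as below.
import numpy as np

N = 128                     # the demo's benchmark used 512x512; kept small
a = np.random.rand(N, N)    # so this serial version finishes quickly
b = np.random.rand(N, N)

def kernel(y, x):
    """One output element of the N x N matrix product."""
    return sum(a[y, k] * b[k, x] for k in range(N))

out = np.empty((N, N))
for y in range(N):          # the serial 'fallback': visit every coordinate
    for x in range(N):
        out[y, x] = kernel(y, x)

assert np.allclose(out, a @ b)   # same answer, just much more slowly
```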

Continue reading “Using The GPU From JavaScript”

World’s Worst Bitcoin Mining Rig

Even if we don’t quite understand what’s happening in a Bitcoin mine, we all pretty much know what’s needed to set one up. Racks of GPUs and specialized software will eventually find a few of these vanishingly rare virtual treasures, but if you have enough time, even a Xerox Alto from 1973 can be turned into a Bitcoin mine. As for how much time it’ll take [Ken Shirriff]’s rig to find a Bitcoin, let’s just say that his Alto would need to survive the heat death of the universe. About 5000 times. And it would take the electricity generated by a small country to do it.

Even though it’s not exactly a profit center, it gives [Ken] a chance to show off his lovingly restored Alto. The Xerox machine is the granddaddy of all modern PCs, having introduced almost every aspect of the GUI world we live in. But with a processor built from discrete TTL chips and an instruction set that doesn’t even have logical OR or XOR functions, the machine isn’t exactly optimized for SHA-256 hashing. The fact that [Ken] was able to implement a mining algorithm at all is impressive, and his explanation of how Bitcoin mining is done is quite clear and a great primer for cryptocurrency newbies.
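For anyone who wants to see what [Ken]’s Alto is grinding away at, the inner loop of Bitcoin mining is conceptually tiny: hash an 80-byte block header twice with SHA-256 and check whether the result comes in under the network’s difficulty target. Here’s a minimal Python sketch of that loop; the header bytes and the artificially easy target are made up so the example actually finishes.

```python
# A minimal sketch of the miner's inner loop -- dummy header bytes and an
# artificially easy target, purely for illustration.
import hashlib
import struct

header_prefix = b"\x00" * 76      # version, prev hash, merkle root, time, bits (all dummy)
target = 2 ** 240                 # real targets are astronomically harder to hit

nonce = 0
while True:
    header = header_prefix + struct.pack("<I", nonce)     # 80-byte block header
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    if int.from_bytes(digest, "little") < target:          # below target = valid block
        print("found nonce", nonce)
        break
    nonce += 1
```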

[Ken] seems to enjoy sending old computer hardware to the Bitcoin mines — he made an old IBM mainframe perform the trick a while back. But if you don’t have a room-size computer around, perhaps reading up on alternate uses for the block chain would be a good idea.

[via Dangerous Prototypes]

Microchip’s PIC32MZ DA — The Microcontroller With A GPU

When it comes to displays, there is a gap between a traditional microcontroller and a Linux system-on-a-chip (SoC). The SoC that lives in a smartphone will always have enough RAM for a framebuffer and usually has a few pins dedicated to an LCD interface. Today, Microchip has announced a microcontroller that blurs the lines between what can be done with an SoC and what can be done with a microcontroller. The PIC32MZ ‘DA’ family of microcontrollers is designed for graphics applications and comes with a boatload of RAM and a dedicated GPU.

The key features of this chip are that framebuffer RAM and a 2D GPU. The PIC32MZ DA family includes packages with 32 MB of integrated DRAM designed to be used as framebuffers. Support for 24-bit color on SXGA (1280 x 1024) panels is included. There’s also a 2D GPU in there with support for sprites, blitting, alpha blending, line drawing, and filling rectangles. No, it can’t play Crysis — just to get that meme out of the way — but it is an excellent platform for GUIs.
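As a quick sanity check on why 32 MB is the headline number, a back-of-the-envelope calculation (ours, not Microchip’s) shows how much of it a full SXGA framebuffer actually eats:

```python
# Back-of-the-envelope framebuffer math -- our own numbers, not Microchip's.
width, height = 1280, 1024              # SXGA panel
bytes_per_pixel = 3                     # 24-bit color
frame = width * height * bytes_per_pixel

print(frame / 2**20)                    # ~3.75 MB for a single frame
print(2 * frame / 2**20)                # ~7.5 MB double-buffered, well within 32 MB
```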

Continue reading “Microchip’s PIC32MZ DA — The Microcontroller With A GPU”

Neural Networks: You’ve Got It So Easy

Neural networks are all the rage right now with increasing numbers of hackers, students, researchers, and businesses getting involved. The last resurgence was in the 80s and 90s, when there was little or no World Wide Web and few neural network tools. The current resurgence started around 2006. From a hacker’s perspective, what tools and other resources were available back then, what’s available now, and what should we expect for the future? For myself, a GPU on the Raspberry Pi would be nice.

Continue reading “Neural Networks: You’ve Got It So Easy”