FFT On The Raspi’s GPU


The Raspberry Pi has been around for two years now, and still there’s little the hardware hacker can actually do with the integrated GPU. That just changed: the Raspberry Pi Foundation has announced a library for Fourier transforms that runs on the GPU.

For those of you who haven’t yet taken your DSP course, Fourier transforms take a function (or audio signal, radio signal, or what have you) and break it down into its component frequencies. It’s damn useful for everything from software defined radios to guitar pedals, and the new GPU_FFT library is about ten times faster at this task than the Raspi’s CPU.

You can get a copy of the GPU_FFT library by running rpi-update on your Pi. If you happen to build anything interesting – something with a software defined radio or even a guitar pedal – you’re more than welcome to send it in to the Hackaday tips line. We’d love to see what you’re up to.
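To give a feel for the API, here’s a stripped-down sketch modeled on the hello_fft sample that ships alongside the library. Treat it as an outline rather than gospel: header locations, error codes, and struct fields may vary with your firmware version.

```c
#include <stdio.h>
#include <math.h>
#include "mailbox.h"   /* helpers from the hello_fft sample directory */
#include "gpu_fft.h"

int main(void) {
    int i, log2_N = 12;            /* 4096-point transform                  */
    int N = 1 << log2_N;
    int mb = mbox_open();          /* talk to the VideoCore via the mailbox */
    struct GPU_FFT *fft;
    struct GPU_FFT_COMPLEX *in, *out;

    /* Allocate GPU memory and build the FFT pipeline once */
    if (gpu_fft_prepare(mb, log2_N, GPU_FFT_FWD, 1, &fft) != 0)
        return -1;

    /* Fill the input buffer with a test tone that lands in bin 100 */
    in = fft->in;
    for (i = 0; i < N; i++) {
        in[i].re = cos(2.0 * M_PI * 100.0 * i / N);
        in[i].im = 0.0f;
    }

    gpu_fft_execute(fft);          /* the transform itself runs on the GPU  */

    out = fft->out;                /* frequency bins land here              */
    printf("bin 100: %f + %fi\n", out[100].re, out[100].im);

    gpu_fft_release(fft);          /* VideoCore memory is lost if not freed */
    return 0;
}
```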

An Open Source GPU

Unless you’re bit-banging a CRT interface or using a bunch of resistors to connect a VGA monitor to your project, odds are you’re using proprietary hardware as a graphics engine. The GPU on the Raspberry Pi is locked up under an NDA, and the dream of an open source graphics processor has yet to be realized. [Frank Bruno] at Silicon Spectrum thinks he has the solution to that: a completely open source GPU implemented on an FPGA.

Right now, [Frank] has a very lightweight 2D and 3D engine well-suited for everything from servers to embedded devices. If the Kickstarter meets its goal, Silicon Spectrum will release the design to the world, giving every developer and hardware hacker out there a complete, fully functional, open source GPU.

Given the difficulties [Bunnie] had finding a GPU that doesn’t require an NDA to develop for, we’re thinking this is an awesome project that gets away from the closed-source binary blobs found on the Raspberry Pi and other ARM dev boards.

A Macbook Air And A Thunderbolt GPU

When Intel and Apple released Thunderbolt, hallelujahs from the Apple choir were heard. Since very little in any of Apple’s hardware lineup is upgradeable, an external video card is the best of all possible worlds. Unfortunately, Intel doesn’t seem to be taking kindly to the idea of external GPUs. That hasn’t stopped a few creative people like [Larry Gadea] from figuring it out on their own. Right now he’s running a GTX 570 through the Thunderbolt port of his MacBook Air, and displaying everything on the internal LCD. A dream come true.

[Larry] is doing this with a few fairly specialized bits of hardware. The first is a Thunderbolt to ExpressCard/34 adapter, after that an ExpressCard to PCI-E adapter. Couple that with a power supply, GPU, and a whole lot of software configuration, and [Larry] had a real Thunderbolt GPU on his hands.

There are, of course, a few downsides to running a GPU through a Thunderbolt port. The current Thunderbolt spec is equivalent to a PCI-E 4X slot, a quarter of what is needed to get all the horsepower out of high-end GPUs. That being said, it is an elegant-yet-kludgy way to get better graphics performance out of the MBA.

Demo video below.


Veronica VGA Board Finalized


The latest update in the Veronica 6502 computer project is this finalized VGA board which now has a home in the machine’s backplane.

We’ve been glued to the updates [Quinn Dunki] has been posting about the project for many months now. Getting the GPU working proved to take quite a bit of time, but we learned a ton just by following along. The video output had humble beginnings way back in March. That breadboarded circuit got complicated very quickly, and that was before it was even interfaced with the CPU. As you can see from the image above, etching and populating the GPU board really cleans up the build. We’re sure it’s robust enough to move around at this point. We wonder if she’s planning on showing it off at a Maker Faire or another geeky gathering.

It really has become clear how wise [Quinn] was to design a backplane board early on. It plays right into the modular concept. She was even smart enough to include that SIL pin header on the near side of the board, which was used heavily while prototyping this video module.

Interfacing A GPU With A CPU


[Quinn Dunki] pulled together many months’ worth of work by interfacing her GPU with the CPU. This is one of the major milestones in her Veronica project, which aims to build a computer from the ground up.

We’ve seen quite a number of posts from her regarding the AVR-powered GPU. So far the development of that component has been happening separately from the 6502-centered CPU. But putting them together is anything but trivial. The timing issues that were so important to consider when developing the GPU get even hairier when it comes to writing to the VRAM from an external component. Her first thought was to share a portion of the external RAM between the CPU and GPU as a way to push rendering commands from one to the other. This proved troublesome both in timing and in the number of pins available on the AVR chip. She ended up using something of a virtual register on the AVR chip that can receive commands from the CPU asynchronously. Timing dictates that these commands be written only during vertical blanking, so this virtual register also acts as a status register to let the CPU know when it can send the next command.
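As a very rough sketch of that scheme (not [Quinn]’s actual firmware, and with pin assignments invented purely for illustration), the AVR can latch a command byte in an interrupt when the 6502 strobes it, raise a busy line, and only act on the command once vertical blanking comes around:

```c
#include <avr/io.h>
#include <avr/interrupt.h>

/* Hypothetical wiring: CPU data bus on PORTC, a command strobe on INT0,
 * and a single "busy" status line on PB0 that the 6502 can poll.        */

volatile uint8_t command;          /* the "virtual register"              */
volatile uint8_t command_ready;    /* set by the ISR, cleared at vblank   */

ISR(INT0_vect) {                   /* CPU strobes a new command byte      */
    command = PINC;                /* latch it asynchronously             */
    command_ready = 1;
    PORTB |= _BV(PB0);             /* raise busy: hold off further writes */
}

static void vblank_handler(void) { /* called once per vertical blanking   */
    if (command_ready) {
        /* ...decode the command and touch VRAM here, while it's safe...  */
        command_ready = 0;
        PORTB &= ~_BV(PB0);        /* drop busy: CPU may send the next    */
    }
}

int main(void) {
    DDRB  |= _BV(PB0);                    /* busy line is an output       */
    EICRA |= _BV(ISC01) | _BV(ISC00);     /* strobe on a rising edge      */
    EIMSK |= _BV(INT0);
    sei();
    for (;;) {
        /* ...video generation loop; call vblank_handler() during the
         *    vertical blanking interval...                               */
    }
}
```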

Her post is packed with the theory behind the design, timing tests on the oscilloscope, and a rather intimidating schematic. But the most important part is the video showing her success in the end.

25 GPUs Brute Force 348 Billion Hashes Per Second To Crack Your Passwords

It’s our understanding that the video game industry has long been a driving force behind new and better graphics processing hardware. But gamers aren’t the only ones to benefit from these advances. As we’ve heard before, a graphics processing unit is uniquely qualified to churn through cryptographic hashes quickly (we’ve seen this with Bitcoin mining). This project strings together 25 GPU cards across 5 servers to form a super fast brute force attack rig. It’s so fast that the actual specs are beyond our comprehension. How can one understand 348 billion hashes per second?

The testing was done on a collection of password hashes using the LM and NTLM protocols. NTLM is a bit stronger and fared better than LM, but that’s not actually saying much. An eight-character NTLM password will fall in 5.5 hours, while a 14-character LM hash lasts only about six minutes before the solution is discovered. Of course this type of hardware is only good if you have a copy of the password hashes themselves. Login systems will lock out after a certain number of attempts and have measures in place to slow down automated attacks like this one.
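The eight-character figure checks out with some back-of-the-envelope math. Assuming the full 95-character printable ASCII set (the benchmark’s actual charset may have been slightly different), exhausting the keyspace at that rate takes a little over five hours:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double rate     = 348e9;       /* reported hashes per second               */
    double keyspace = pow(95, 8);  /* 8 characters, 95 printable ASCII symbols */
    double seconds  = keyspace / rate;
    printf("Worst case: %.1f hours\n", seconds / 3600.0);  /* about 5.3 hours  */
    return 0;
}
```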

[via Boing Boing]

PC Temperature Monitoring System Lights Up When Things Get Hot


[Taylor] popped a new graphics card into his computer, but before he could settle in for a round of gaming, his card started to overheat. He eventually tracked the problem down to an undersized power supply, but the prospect of cooking his new GPU to death made him think twice about how he was monitoring his system’s health.

To continually keep tabs on his video card’s temperature going forward, he put together a small circuit that will alert him if things start to get too hot. He mounted a small temperature sensor on his graphics card near the GPU, wiring it to an Arduino. The Arduino monitors his video card, lighting an RGB LED blue if conditions are alright. If the temperature rises above 50C, the LED changes to red, signaling a problem.
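A minimal Arduino-style sketch of the same idea might look like the following. This isn’t [Taylor]’s code; it assumes a TMP36-type analog sensor on A0 and the red and blue legs of a common-cathode RGB LED on pins 9 and 10, so adjust for your own parts:

```c
const int   SENSOR_PIN  = A0;    /* assumed TMP36-style analog temperature sensor */
const int   RED_PIN     = 9;
const int   BLUE_PIN    = 10;
const float THRESHOLD_C = 50.0;  /* alert temperature from the write-up           */

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
}

void loop() {
  /* TMP36: 10 mV per degree C with a 500 mV offset at 0 C */
  float volts = analogRead(SENSOR_PIN) * (5.0 / 1023.0);
  float tempC = (volts - 0.5) * 100.0;

  if (tempC >= THRESHOLD_C) {    /* too hot: switch to red */
    digitalWrite(BLUE_PIN, LOW);
    digitalWrite(RED_PIN, HIGH);
  } else {                       /* all good: show blue    */
    digitalWrite(RED_PIN, LOW);
    digitalWrite(BLUE_PIN, HIGH);
  }
  delay(500);
}
```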

We’re aware that there are all sorts of software applications that can monitor component temperatures for you, but the appeal of [Taylor’s] system is that it can be easily seen from across the room rather than via the desktop. That said, we think that his system could take advantage of his PC’s case fan lighting for a more visible warning, and it wouldn’t hurt to wire in an auto-shutdown feature in case his computer overheats while he’s away.