Hacklet 15 – Arcade Fire

This week’s Hacklet is dedicated to arcade games. The arcade parlors of the 80’s and early 90’s may have given way to today’s consoles and PC games, but the classic stand-up arcade cabinet lives on! Plenty of hackers have restored old arcade cabinets, or even built their own. We’re going to take a look at some of the best arcade game-related hacks on Hackaday.io!

[Brayden] starts things off with his Raspberry Pi Vintage Arcade. The Black Vortex is a tabletop arcade cabinet built from a Raspberry Pi, an old monitor, and some nice carpentry skills. Black Vortex uses a Raspberry Pi B+; the extra GPIO pins make interfacing buttons and joystick switches easy. On the software side, [Brayden] is using the popular PiMame (now PiPlay) flavor of Linux built for gaming and emulation. Black Vortex’s shell is plywood. [Brayden] used a pocket hole jig to build a sturdy cabinet without extra support blocks. A stain finish really works on this one!
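We haven’t seen [Brayden]’s exact wiring, but the basic idea is easy to sketch: each switch pulls a GPIO pin to ground, the Pi’s internal pull-ups keep the pins high otherwise, and a small loop watches for transitions. Here’s a minimal Python sketch with purely illustrative pin assignments:

```python
# Minimal sketch: polling arcade controls on a Raspberry Pi B+.
# Pin numbers are illustrative -- use whatever GPIOs your wiring lands on.
import time
import RPi.GPIO as GPIO

CONTROLS = {
    "joy_up": 5,
    "joy_down": 6,
    "joy_left": 13,
    "joy_right": 19,
    "button_1": 20,
    "button_2": 21,
}

GPIO.setmode(GPIO.BCM)
for pin in CONTROLS.values():
    # Internal pull-ups: an idle switch reads HIGH, a pressed switch reads LOW.
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)

try:
    last_state = {name: True for name in CONTROLS}
    while True:
        for name, pin in CONTROLS.items():
            state = GPIO.input(pin)
            if state != last_state[name]:
                print(f"{name} {'pressed' if not state else 'released'}")
                last_state[name] = state
        time.sleep(0.005)  # ~200 Hz polling is plenty for arcade inputs
finally:
    GPIO.cleanup()
```

In practice a GPIO-to-keyboard daemon would translate those presses into the keystrokes the emulator expects, but the hardware side really is that simple.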

Next up, [fredkono] blows our minds with his Arcade XY Monitor From Scratch. [fredkono] repairs classic Atari vector game PCBs and needed a test monitor for his lab. The original Amplifone and WG6100 color XY monitors used in games like Tempest and Star Wars are becoming rather rare. Not a problem, as [fredkono] is building his own. Much like the WG6100, [fredkono] started with a standard color TV CRT. He removed and rewound the yoke for vector operation. The TV’s electronics were replaced with [fredkono’s] own deflection amplifier PCBs. [fredkono] was sure to include the all-important spot killer circuit, which shuts down the electron guns before a stationary spot can burn into the CRT.

[Rhys] keeps things rolling with a pair of projects dedicated to arcade controls. His TI Launchpad Arcade Control to USB Interface contains instructions and code for using a Texas Instruments Tiva C LaunchPad as a USB interface for arcade controls. [Rhys] puts all that to good use in his Arcade Control Panel. The control panel features MAME buttons as well as the standard two-player fighting game button layout. He finished off his panel with some slick graphics featuring red and blue dragons.

[Sarah and Raymond] hosted a Tron: Legacy release party back in 2010. An epic arcade movie calls for an epic arcade game, or in this case, games. Sixteen tabletop arcades, to be exact. All 16 machines were built in just six days. Eight of the machines ran Armagetron Advanced, a networked version of the classic Tron light cycle game. The others ran a mix of classic games like Pac-Man and modern bullet hell shooters like Touhou. The cabinets were built from expanded PVC with wood blocks as a support structure. [Sarah and Raymond] custom painted each cabinet with UV blacklight paint. We love the custom artwork on their personal signature machines!

[Mike] takes us back to the 80’s with Just Another Arcade Machine. Under the hood, this machine uses the standard Raspberry Pi and PiMame (now PiPlay) suite. [Mike] even added a trackball so he could play Centipede. What makes this arcade special is the cabinet. [Mike] found an old wardrobe with that perfect 80’s style metallic strip cladding. He removed the cladding, cut up the chipboard frame, and reassembled everything into a stand-up arcade cabinet that looks like it came right out of Sears’ electronics department in 1985.

Ok folks, that’s it for another episode of The Hacklet. As always, see you next week. Same hack time, same hack channel, bringing you the best of Hackaday.io!

Open Source Marker Recognition For Augmented Reality


[Bharath] recently uploaded the source code for an OpenCV-based pattern recognition platform that can be used for augmented reality, or even robots. It was built with C++ and uses the OpenCV library to detect and decode marker patterns within a single frame.

The program starts out by focusing on one object at a time. This method was chosen to avoid building additional arrays holding information on every blob in the image, which could slow the processing down.
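[Bharath]’s code is C++, but the “largest blob only” idea maps neatly onto a few lines of OpenCV in Python. This is just a sketch of the approach, not his pipeline: threshold the frame, find the contours, and carry only the biggest one forward instead of keeping data on every blob.

```python
import cv2

# Sketch of the "one object at a time" idea: keep only the largest blob
# in the frame rather than tracking every contour OpenCV finds.
frame = cv2.imread("frame.png")            # stand-in for a captured camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY_INV)

# OpenCV 4.x return signature: (contours, hierarchy)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    marker = max(contours, key=cv2.contourArea)    # the single blob we care about
    x, y, w, h = cv2.boundingRect(marker)
    roi = binary[y:y + h, x:x + w]                 # the marker pattern would be decoded from this ROI
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("detected.png", frame)
```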

Although this implementation does not track marker information across multiple frames, it provides a nice foundation for integrating pattern recognition into computer systems. The tutorial is straightforward and easy to read. The entire program and source code can be found on GitHub under a ZERO license, so anyone can use it. A video of the program is embedded after the break:

Continue reading “Open Source Marker Recognition For Augmented Reality”

A MIPI DSI Display Shield/HDMI Adapter


[Tomasz] tipped us off about the well-documented MIPI DSI Display Shield / HDMI Adapter he put up on hackaday.io. The Display Serial Interface (DSI) is a high-speed, packet-based interface for delivering video data to recent LCD/OLED displays. It uses several differential data lanes whose frequencies may reach 1 GHz, depending on the resolution and frame rate required.

The board, explained in the diagram above, therefore allows any HDMI content to be played on the DSI-enabled scrap displays you may have lying around. It includes 32 MB of DDR memory that serves as a frame buffer, so even a “slow” Arduino platform has enough time to upload the picture you want to display.
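Some back-of-the-envelope math shows why 32 MB is comfortable. The panel size below matches the iPhone 4 LCD used in the demo video; 24-bit color is an assumption on our part, not a figure from the project page:

```python
# Rough frame-buffer math for the DSI shield (our numbers, for illustration).
width, height, bytes_per_pixel = 640, 960, 3            # iPhone 4 panel, 24 bpp assumed

frame_bytes = width * height * bytes_per_pixel           # 1,843,200 bytes (~1.8 MB)
buffer_bytes = 32 * 1024 ** 2                             # the shield's 32 MB of DDR
print(f"one frame: {frame_bytes / 1024 ** 2:.2f} MiB")
print(f"frames that fit in the buffer: {buffer_bytes // frame_bytes}")   # 18
```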

The CP2103 does the USB to UART conversion, allowing your computer to configure the display adapter’s internal settings. The platform is based around the XC6SLX9 Spartan-6 FPGA, and all the source code may be downloaded from the official GitHub repository, along with the schematics and Gerbers. After the break we’ve embedded a demonstration video in which a Raspi drives an iPhone 4 LCD.

Continue reading “A MIPI DSI Display Shield/HDMI Adapter”

Open Source GPU Released


Nearly a year ago, an extremely interesting project hit Kickstarter: an open source GPU, written for an FPGA. For reasons that are obvious in retrospect, the GPL-GPU Kickstarter was not funded, but that doesn’t mean these developers don’t believe in what they’re doing. The first version of this open source graphics processor has now been released, giving anyone with an interest a look at what a late-90s-era GPU looks like on the inside. If you’re cool enough, there’s also enough supporting documentation to build your own.

A quick note for the PC Master Race: this thing might run Quake eventually. It’s not a powerhouse. That said, [Bunnie] had a hard time finding an open source GPU for the Novena laptop, and the drivers for the VideoCore IV in the Raspi have only recently been open sourced. A completely open GPU simply doesn’t exist, and short of a few very, very limited thesis projects there hasn’t been anything like this before.

Right now, the GPL-GPU has 3D graphics acceleration working with VGA on a PCI bus. The plan is to update this late-90s setup to interfaces that make a little more sense, and add DVI and HDMI output. Not bad for a failed Kickstarter, right?

Hacking VGA For Trippy Video Effects


Ever since flat panel LCD monitors came on the scene, most old CRTs have found their way into the garbage or into the backs of closets. For this project, it might be a good idea to pull that old monitor or TV out and dust it off! [James] has found a way to hack the VGA input on these devices to get them to display vivid visualizations based on an audio input.

The legacy-hardware-based project is called RGB.VGA.VOLT and works by taking an audio signal as an input, crossing some wires, and sending the signal through a synthesizer. The circuit then creates a high-frequency waveform that displays especially well over VGA. The video can also be channeled back through an audio waveform generator to create a unique sound to go along with the brilliant colors.

[James]’s goals with this project are to generate an aesthetic feeling with his form of art and to encourage others to build upon his work. To that end, he has released the project under an open license, and the project is thoroughly documented on his project site.

There have been plenty of hacks in the past that have implemented other protocols with VGA or implemented VGA on microcontrollers, but none that have hacked the interface entirely to create something that looks like the Star Gate sequence from 2001: A Space Odyssey. We think it’s a great piece of modern art and a novel use of VGA!

Thanks for the tip, [Kyle]!

Sprite Graphics Accelerator On An FPGA

A demo running on an FPGA sprite accelerator

Graphics accelerators move operations to hardware, where they can be executed much faster. This is what allows your Raspberry Pi to display high definition video decently. [Andy]’s latest build is a 2D sprite engine, featuring hardware accelerated graphics on an FPGA.

In the simplest mode, the sprite engine just passes commands through to the LCD, which allows for basic control. The fun part is sprite mode, which allows sprites to be loaded onto the FPGA. At that point, you can show, hide, and move each sprite. By overlapping many sprites, you get something like the demo shown above.
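The compositing itself happens in hardware, but the idea is easy to show in software. Here’s a hypothetical Python version that layers color-keyed sprites onto a background, which is roughly what the FPGA does for every frame, just far faster:

```python
import numpy as np

# Software analogy of what the sprite engine does each frame: draw every visible
# sprite, in order, on top of a background, treating one color as transparent.
# Sizes and the key color below are arbitrary choices for illustration.
WIDTH, HEIGHT = 640, 360          # matches the LCD resolution from the article
KEY = (255, 0, 255)               # magenta pixels in a sprite count as transparent

def compose(background, sprites):
    """sprites is a list of (x, y, visible, pixels) tuples; pixels is an HxWx3 array."""
    frame = background.copy()
    for x, y, visible, pixels in sprites:
        if not visible:
            continue
        h, w, _ = pixels.shape
        region = frame[y:y + h, x:x + w]
        opaque = np.any(pixels != KEY, axis=-1)      # mask of non-key pixels
        region[opaque] = pixels[opaque]              # overwrite only opaque pixels
    return frame

background = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
ball = np.full((16, 16, 3), (0, 200, 255), dtype=np.uint8)
frame = compose(background, [(100, 50, True, ball), (120, 60, True, ball)])
```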

The FPGA is from Xilinx, and uses their block RAM IP to store the state of the sprites. The actual sprite data is contained on a 128 Mb external flash chip, since the sprites themselves require significant space.

The game logic runs on a STM32 Cortex M4 microcontroller which communicates with the FPGA and orders the sprites around. The FPGA then deals with generating frames and sending them to the LCD screen, freeing up the microcontroller.

If you’re wondering about the LCD itself, it’s 3.2″, 640 x 360, and taken from an Ericsson U5 Vivaz cellphone. [Andy] has a detailed writeup on reverse engineering it. After the break, he gives us a video overview of the whole system.

Continue reading “Sprite Graphics Accelerator On An FPGA”

Hyperlapse Makes Your HeadCam Videos Awesome

First-person video: between Google Glass, GoPro, and other sports cameras, it seems like everyone has a camera on their head these days. If you’re a surfer or skydiver, that might make for some awesome footage. For the rest of us, though, it means hours of boring video. The obvious way to fix this is time-lapse. Typically, time-lapse throws frames away: taking 1 of every 10 frames results in a 10x speed increase. Unfortunately, speeding up a head-mounted camera often leads to a video so bouncy it can’t be watched without an air sickness bag handy. [Johannes Kopf], [Michael Cohen], and [Richard Szeliski] at Microsoft Research have come up with a novel solution to this problem with Hyperlapse.
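For reference, the naive baseline really is that simple. Here’s a quick OpenCV sketch that keeps one frame in ten (file names are placeholders):

```python
import cv2

# Naive time-lapse: keep 1 of every 10 frames for a 10x speed-up.
# This is the "bouncy" baseline that Hyperlapse improves on.
SKIP = 10
reader = cv2.VideoCapture("headcam.mp4")
fps = reader.get(cv2.CAP_PROP_FPS) or 30.0
width = int(reader.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT))

writer = cv2.VideoWriter("timelapse.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         fps, (width, height))
index = 0
while True:
    ok, frame = reader.read()
    if not ok:
        break
    if index % SKIP == 0:        # every 10th frame survives
        writer.write(frame)
    index += 1

reader.release()
writer.release()
```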

Hyperlapse photography is not a new term. Typically, hyperlapse films require careful planning, camera rigs, and labor-intensive post-production to achieve a usable video. [Johannes] and team have thrown computer vision and graphics algorithms at the problem. The results are nothing short of amazing.

The full details are available in the team’s report (35MB PDF warning). To obtain usable data, the fisheye lenses often used on these cameras must be calibrated; the team accomplished that with the OCamCalib toolbox. Imported video is broken down frame by frame. Using structure-from-motion algorithms, Hyperlapse creates 3D models of the various scenes in the video. With the scenes in this virtual world, the camera can be moved and aimed at will. The team’s algorithms then pick a smooth path that follows the original camera’s trajectory. Once the camera’s position is known, it’s simply a matter of rendering the final video.
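The real path selection is an optimization over the reconstructed geometry, but the flavor of “smooth the recovered camera positions and follow that instead” can be shown with a toy moving-average filter (ours, not the paper’s):

```python
import numpy as np

# Toy illustration only: the camera positions recovered by structure from motion
# are noisy, and the virtual camera follows a smoothed version of that path.
# The actual paper solves an optimization; a moving average just shows the idea.
def smooth_path(positions, window=15):
    """positions: (N, 3) array of camera positions, one row per frame."""
    kernel = np.ones(window) / window
    # Pad with edge values so the path keeps its endpoints, then filter each axis.
    padded = np.pad(positions, ((window // 2, window // 2), (0, 0)), mode="edge")
    return np.stack([np.convolve(padded[:, axis], kernel, mode="valid")
                     for axis in range(3)], axis=1)

rng = np.random.default_rng(0)
shaky = np.cumsum(rng.normal(0, 0.05, size=(300, 3)), axis=0)   # fake bouncy walk
smooth = smooth_path(shaky)
print(shaky.shape, smooth.shape)    # both (300, 3)
```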

The results aren’t perfect. The mountain climbing scenes show some artifacts caused by the camera frame rate and exposure changing due to the varied lighting conditions. People appear and disappear in the bicycling portion of the video.

One thing the team doesn’t mention is how long the process takes. We’re sure this kind of rendering must require some serious time and processing power. Still, the output video is stunning.

Continue reading “Hyperlapse Makes Your HeadCam Videos Awesome”