A Comparison of Early Graphics Cards

We have to admit, we expected to be bored by [The 8-Bit Guy]’s presentation, only to find ourselves riveted by his comparison of early graphics card technology.

Some presentations get a bit technical, which isn’t bad, but what makes this one so interesting is the clear explanation of what the market was like, and what the experience was like for users at the time. For example, one bit we found really interesting was the mention of later games not supporting some of the neat CGA color hacks because the developers couldn’t fully emulate them on the VGA cards they were working on. Likewise, it was interesting to see why a standard like RGBI even existed in the first place, with his comparison of text over composite video against the much clearer text over RGBI.

We learned a lot, and some mysteries about the bizarre color choices in old games make a lot more sense now. Video after the break.

The Most Immersive Pinball Machine: Project Supernova

Over at [Truthlabs], a 30-year-old pinball machine was diagnosed with a major flaw in its game design: it could only entertain one person at a time. [Dan] and his colleagues set out to change this, transforming the ol’ pinball legend “Firepower” into a spectacular, immersive gaming experience worthy of the 21st century.

A major limitation they wanted to overcome was screen size: a projector mounted to the ceiling turns the entire wall behind the machine into a massive 15-foot playfield for anyone in the room to enjoy.

With so much space to fill, the team developed a visual concept tailored to blend seamlessly with the original storyline of the arcade classic, studying the machine’s artwork and digging deep into the sci-fi archives. They then translated their ideas into 3D graphics using Cinema 4D and WebGL, along with the usual designer’s toolbox. Lasers and explosions were added, ready to be triggered by game events on the machine.

To hook the augmentation into the pinball machine’s own game progress, they devised an elegant solution incorporating OpenCV and OCR to read all five of the machine’s 7-segment displays with a single webcam. An Arduino inside the machine taps into the numerous mechanical switches and indicator lamps, keeping a Node.js server updated on pressed buttons, hits, the “Lane Change”, and plunged balls.
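
The write-up doesn’t spell out the exact pipeline, but a minimal sketch of the idea might look like the following, with hypothetical bounding boxes for the five displays and the Tesseract C++ API standing in for the OCR step (the project’s actual code almost certainly differs):

#include <opencv2/opencv.hpp>
#include <tesseract/baseapi.h>
#include <iostream>
#include <memory>
#include <vector>

int main() {
    cv::VideoCapture cam(0);                          // the single webcam watching the backglass
    tesseract::TessBaseAPI ocr;
    ocr.Init(nullptr, "eng");                         // stock English model; restrict it to digits below
    ocr.SetVariable("tessedit_char_whitelist", "0123456789");
    ocr.SetPageSegMode(tesseract::PSM_SINGLE_LINE);

    // Hypothetical bounding boxes, one per 7-segment display (x, y, width, height)
    std::vector<cv::Rect> displays = {
        {40, 60, 150, 50}, {210, 60, 150, 50}, {380, 60, 150, 50},
        {40, 140, 150, 50}, {210, 140, 150, 50}
    };

    cv::Mat frame, gray, bin;
    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, bin, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);
        for (const cv::Rect& roi : displays) {
            cv::Mat digits = bin(roi);                // crop one display out of the frame
            ocr.SetImage(digits.data, digits.cols, digits.rows, 1,
                         static_cast<int>(digits.step));
            std::unique_ptr<char[]> text(ocr.GetUTF8Text());
            std::cout << text.get() << "  ";          // in the real build this would go to the Node.js server
        }
        std::cout << std::endl;
    }
    ocr.End();
    return 0;
}

In practice, the segmented typeface and reflections off the backglass make the OCR step the fiddly part, so the real pipeline presumably involves more preprocessing than shown here.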

The result is the impressive demonstration of both passion and skill you can see in the video below. We really like the custom shader effects. How could we ever play pinball without them?

Hand Gestures Drive Car

There are a number of ways to control an automobile without using the pedals, and sometimes even without using the steering wheel. Most commonly these alternative control mechanisms are installed in vehicles whose owners are disabled in some way, but [Anurag] has taken this idea of alternative control one step further. He has built a car that can be driven by hand gestures alone.

A Raspberry Pi 2 installed on a remote-controlled car handles processing and communication. The Pi creates a wireless network, and a laptop connects to the Pi over that network. The laptop’s webcam captures frames at 15 fps and checks each one for the driver’s hand gestures: the image is converted to grayscale and thresholded, contours are extracted, and the centroid and farthest points of the hand are computed.
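
As a rough illustration of that pipeline (not [Anurag]’s actual code, which lives on GitHub and will differ in its thresholds and decision rules), a minimal OpenCV sketch that finds the hand’s centroid and farthest contour point in each frame might look like this:

#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cam(0);
    cv::Mat frame, gray, bin;
    std::vector<std::vector<cv::Point>> contours;

    while (cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, bin, 70, 255, cv::THRESH_BINARY_INV);   // threshold value is a guess
        cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        if (contours.empty()) continue;

        // Assume the largest contour is the hand
        auto hand = std::max_element(contours.begin(), contours.end(),
            [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
                return cv::contourArea(a) < cv::contourArea(b);
            });

        cv::Moments m = cv::moments(*hand);
        if (m.m00 == 0) continue;
        cv::Point centroid(int(m.m10 / m.m00), int(m.m01 / m.m00));

        // Farthest contour point from the centroid, roughly the extended fingertip
        cv::Point tip = (*hand)[0];
        double best = 0;
        for (const cv::Point& p : *hand) {
            double d = std::hypot(p.x - centroid.x, p.y - centroid.y);
            if (d > best) { best = d; tip = p; }
        }

        // A crude movement decision based on where the fingertip sits relative to the centroid
        std::string cmd = "stop";
        if (tip.y < centroid.y - 40)      cmd = "forward";
        else if (tip.x < centroid.x - 40) cmd = "left";
        else if (tip.x > centroid.x + 40) cmd = "right";
        std::cout << cmd << std::endl;    // the real project sends this decision to the Pi over the network
    }
    return 0;
}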

After some calculations, a movement decision is made. The decision is passed to the Pi, which in turn passes it on to the car’s internal control chip. All of the code is available on the project’s GitHub page. [Anurag] hopes that this can be scaled up to full-sized cars in the future. We’ve seen gesture-based remote controls before that rely on sonar sensors, so it’s interesting to see one that relies strictly on image processing.

Peering Inside the GPU Black Box

Researchers at Binghamton University have built their own graphics processing unit (GPU) that can be flashed into an FPGA. While “graphics” is in the name, this GPU design aims to provide a general-purpose computing peripheral: a GPGPU testbed. Of course, that doesn’t mean you can’t play Quake (slowly) on it.

The Binghamton crew’s design is not only open, but easily modifiable. It’s a GPGPU where you not only know what’s going on inside the silicon, but also have open-source drivers and interfaces. As Prof. [Timothy Miller] says,

 It was bad for the open-source community that GPU manufacturers had all decided to keep their chip specifications secret. That prevented open source developers from writing software that could utilize that hardware. With contributions from the ‘open hardware’ community, we can incorporate more creative ideas and produce an increasingly better tool.

That’s where you come in. [Jeff Bush], a member of the team, has a great blog with a detailed walk-through of the GPU’s design. All of the Verilog and C++ code is up on [Jeff]’s GitHub, including documentation.

If you’re interested in the deep magic that goes on inside GPUs, here’s a great way to peek inside the black box.

Home Made 8mm Digitizer

The 8mm film look is making a comeback, but distributing it is an issue. [Heikki Hietala] wanted an easy way to digitally capture the 8mm movies he made, so he built an 8mm digitizer from an Arduino, a cheap Canon camera, and the guts of an old 8mm film camera. Throw in a few 3D-printed components and some odd electronics, and you get a build that captures 8mm film with impressive speed and quality.

This build started with a Canon Ixus 5 camera running CHDK (the Canon Hack Development Kit) to lock its settings down. The camera points at the film strip through a macro lens so that each frame of film fills the image. An Arduino then triggers the camera to take a photo over a USB cable. The same Arduino also controls a motor that winds the film and operates the film gate salvaged from the donor camera. By reversing the gate’s original function and driving it with a servo, he can easily blank off the edges of the frame so that no stray light shining through the film material causes any problems. Once the camera has captured every frame on the strip, he feeds the images into Blender, which processes them and spits out the final movie.
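
For a sense of how the Arduino side could fit together, here is a minimal sketch of the capture loop. It assumes CHDK’s USB-remote feature (the shutter fires when 5 V appears on the camera’s USB port) and uses entirely hypothetical pin assignments, timings, and a frame-advance switch; [Heikki]’s own code and wiring are detailed in his blog series:

#include <Servo.h>

const int SHUTTER_PIN = 7;   // switches 5 V onto the Ixus’ USB port (CHDK USB remote)
const int MOTOR_PIN   = 8;   // transistor driving the film-advance motor
const int FRAME_SW    = 2;   // hypothetical switch that closes once per advanced frame

Servo gateServo;             // servo repurposed to work the salvaged film gate

void setup() {
  pinMode(SHUTTER_PIN, OUTPUT);
  pinMode(MOTOR_PIN, OUTPUT);
  pinMode(FRAME_SW, INPUT_PULLUP);
  gateServo.attach(9);
}

void loop() {
  gateServo.write(90);              // close the gate to mask off the edges of the frame
  delay(300);

  digitalWrite(SHUTTER_PIN, HIGH);  // CHDK fires the shutter when it sees 5 V on USB
  delay(200);
  digitalWrite(SHUTTER_PIN, LOW);
  delay(1500);                      // give the camera time to store the image

  gateServo.write(0);               // open the gate before moving the film
  digitalWrite(MOTOR_PIN, HIGH);    // wind on to the next frame...
  while (digitalRead(FRAME_SW) == HIGH) { }   // ...until the frame switch trips
  digitalWrite(MOTOR_PIN, LOW);
  delay(100);
}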

This is a very impressive build overall. [Heikki] has obviously put a lot of thought into it, and the whole thing looks like it runs quickly and efficiently. The captured video looks great, as you can see from this sample. The decision to use a salvaged film gate was a smart one: there is no point in reinventing the wheel if engineers of previous generations have already solved the problem. Kudos to [Heikki] for documenting the process in a lot of detail: he has produced a five-part series on his blog that shows how and why he made the decisions he did, covering an overview of the project, using CHDK to control the camera, 3D printing the parts, wiring up the Arduino, and writing the code that ties the system together.

This sits nicely alongside the 8mm to video camera hack that we wrote about recently. This one doesn’t involve taking apart the camera (except for the sacrificial one that supplied the gate), and you still get that wonderfully grainy, jumpy look of 8mm film.

Retrotechtacular: Eidophor, an Unknown but Widely Used Projector

If you own a video projector, be it a module small enough to fit in a mobile phone or one designed for a cinema screen, the chances are it will have a DLP device at its heart: an array of microscopic mirrors on an integrated circuit, the current state of the art in video projection technology.

Perhaps you own an older video projector, or maybe a cheaper new one. If so, the chances are it’ll have a small LCD screen doing its work, taking the place of the Kodachrome in something very similar to your grandparents’ slide projector or their grandparents’ magic lantern.

LCD technology was invented in the 1970s, while DLP was invented at the end of the 1980s. So how did the video projectors that were such a staple of televised spectaculars in the preceding decades work? For that matter, how did NASA project their status displays on the huge screen at Mission Control? Certainly not with CRT technology; even the brightest CRT projectors weren’t up to filling a cinema-sized screen.

The answer came from the Eidophor (Greek: ‘eido’ and ‘phor’, ‘image’ and ‘bearer’), a device invented in the years before World War II by the Swiss physicist Dr. Fritz Fischer and granted a US patent in 1945. It featured a complex vacuum device in which an electron gun painted the video frames as a raster on an oil-covered mirror in the light path of a fairly conventional projector. High-voltage electric charges have the effect of deforming the surface of mineral oils, and it was this effect that was exploited to vary the effectiveness of the mirror as the raster was drawn. An unfortunate side-effect of tracing an oil surface with an electron beam is that a charge will build up on the oil surface, so the entire oil-covered mirror assembly had to rotate within its vacuum enclosure and pass under an electrode which removed any charge build-up.

Eidophor [by Topquark2 CC-BY-SA 3.0]
The resulting machine, as seen in this 1952 issue of Popular Science, was very large, complex, and expensive to run, but it delivered by far the brightest and sharpest projected video available. In a literal sense these machines painted the backdrop to our culture, finding a home not only in NASA’s control room but also in television studios and at large televised events. Take this Shirley Bassey performance from the 1960s, for example, or the spectacular video light show in this rather poor-quality VHS YouTube clip from the Seville Expo in 1992.

You will probably be unaware of the exact date you last saw an Eidophor in action; Quince Imaging tell us their last one was used at the TWA Dome in St. Louis in July 2000. Eidophors may have become more compact over the decades, but they remained costly to run, and through the 1990s they were supplanted by DLP devices that did substantially the same job with a lot less fuss.

It is not often that a search of the Hackaday archives for a technology returns no results, but the Eidophor is one of those cases. Perhaps that is a fitting epitaph for a device that created the show but never starred in it: only its spectacular performances live on.

Adventures in Small Screen Video

[Kevin] wanted to make something using a small CRT, maybe an oscilloscope clock or something similar. He thought he’d scored big with a portable black-and-white TV that someone threw away, but it wouldn’t power on. Once he opened it up, the culprit seemed obvious: a couple of crusty, popped capacitors. [Kevin] ordered some new ones and played with the Arduino TVout code while he waited.

The caps arrived, but the little TV still wouldn’t chooch. Closer inspection revealed that someone had been there before him and ripped out some JST-connected components. Undaunted, [Kevin] went looking for a new CRT and found a vintage JVC camcorder viewfinder with a 1-1/8″ screen on the electronic bay.

At this point, he knew he wanted to display the time, date, and temperature. He figured out how the viewfinder CRT is wired, correctly assuming that the lone shielded wire is meant for composite video. It worked, but the image was backwards and off-center. No problem: it was just a matter of tracing out the horizontal and vertical deflection wires, swapping the horizontal ones, and nudging a few pixels in the code. Now he just has to spin a PCB, build an enclosure, and roll his own font.
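
The TVout library makes the display side of a build like this straightforward. Here is a minimal sketch of what a clock-and-temperature readout over composite might look like, with a placeholder time string and a hypothetical TMP36 sensor on A0 standing in for [Kevin]’s actual data sources:

#include <TVout.h>
#include <fontALL.h>

TVout TV;
const int TEMP_PIN = A0;             // hypothetical TMP36 temperature sensor

void setup() {
  TV.begin(NTSC, 128, 96);           // composite video out, fed to the viewfinder CRT
  TV.select_font(font6x8);
}

void loop() {
  // TMP36: 10 mV per degree C with a 500 mV offset
  float mv = analogRead(TEMP_PIN) * 5000.0 / 1023.0;
  float tempC = (mv - 500.0) / 10.0;

  TV.clear_screen();
  TV.print(10, 20, "12:34:56");      // placeholder; a real build would read an RTC
  TV.print(10, 32, "2016-03-14");    // placeholder date
  TV.set_cursor(10, 44);
  TV.print(tempC, 1);                // TVout's print handles floats like Arduino's Print
  TV.print(" C");
  TV.delay(1000);                    // roughly one second between refreshes
}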

[Kevin]’s CRT is pretty small, but it’s got to be easier on the eyes than the tiniest video game system.