FPGA plays Mario like a champ


This isn’t an FPGA emulating Mario Bros., it’s an FPGA playing the game by analyzing the video and sending controller commands. It’s a final project for ECE 5760, the Advanced FPGA course over at Cornell University that provides us with entertainment every time final projects come due.

Developed by team members [Jeremy Blum], [Jason Wright], and [Sima Mitra], the video parsing is a hack. To get things working they converted the NES’s 240p video signal to VGA, which results in the rolling frame seen in the demo video. The conversion also messes with the aspect ratio and causes a few other headaches, but the FPGA still manages to interpret the image correctly.

Look closely at the screen capture above and you’ll see some stuff that shouldn’t be there. The team developed a set of tests to identify obstacles in Mario’s path. The red lines mark blocks he will have to jump over; the same approach flags pits he needs to avoid, and a separate set of tests detects moving enemies. Once it knows what to do, the FPGA generates the necessary controller signals and pushes them to the vintage console, seeing Mario safely to the end of the first level.
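The team’s detection logic lives in Verilog on the FPGA, but the core idea translates to a few lines of plain C. Here is a minimal sketch under assumed names: the frame size, ground row, pixel classes, and look-ahead distance are all hypothetical, not values from the project.

```c
#include <stdbool.h>
#include <stdint.h>

#define FRAME_W    320        /* hypothetical captured frame width      */
#define FRAME_H    240        /* hypothetical captured frame height     */
#define GROUND_ROW 200        /* hypothetical row where the ground sits */
#define LOOKAHEAD  24         /* columns ahead of Mario to inspect      */

/* One byte per pixel, already classified by a color-matching stage. */
enum pixel_class { PX_SKY, PX_GROUND, PX_BLOCK, PX_ENEMY };

/* Decide whether Mario should jump: a block at ground level or a missing
 * stretch of ground (a pit) in the look-ahead window both trigger a jump. */
bool should_jump(const uint8_t frame[FRAME_H][FRAME_W], int mario_col)
{
    int ground_pixels = 0;

    for (int dx = 1; dx <= LOOKAHEAD; dx++) {
        int col = mario_col + dx;
        if (col >= FRAME_W)
            break;

        /* Obstacle test: a block sitting on the row just above the ground. */
        if (frame[GROUND_ROW - 1][col] == PX_BLOCK)
            return true;

        /* Pit test: count how much ground is actually under the window. */
        if (frame[GROUND_ROW][col] == PX_GROUND)
            ground_pixels++;
    }

    /* If most of the look-ahead window has no ground, treat it as a pit. */
    return ground_pixels < LOOKAHEAD / 2;
}
```

On the real hardware a decision like this would run once per video frame, with the jump bit shifted out on the controller’s serial data line when the NES polls it, rather than returned from a function call.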

We think this is more hardcore than some other autonomous Mario-playing hacks simply because it patches into the original console hardware instead of using an emulator.

Continue reading “FPGA plays Mario like a champ”

Virtual chess uses glove controllers


Check out the game of chess going on above. It’s a virtual game where each player uses a glove as the controller. Of course, the game board and pieces are missing from this image; they’re displayed on a computer monitor that both players can see.

The hardware is rather simple, and we think it would be a great project to challenge your microcontroller skills. Each glove has an accelerometer attached to it, as well as a ring of copper foil on the pointer finger and thumb. A single ATmega1284 monitors both gloves. The accelerometer data moves the mouse cursor on the screen, while the finger contacts grip or release a playing piece. The game board and pieces are displayed using MATLAB, with controller commands fed to it over a USB connection.
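For a feel of what the glove firmware involves, here’s a minimal sketch of one glove’s polling loop, assuming avr-libc on the ATmega1284. The pin assignments, clock speed, baud rate, and packet format are our own assumptions, not details from the project.

```c
#define F_CPU 16000000UL      /* assumed 16 MHz clock */
#include <avr/io.h>
#include <util/delay.h>

static uint16_t adc_read(uint8_t channel)
{
    ADMUX  = (1 << REFS0) | (channel & 0x1F);   /* AVcc reference, pick channel */
    ADCSRA |= (1 << ADSC);                      /* start conversion             */
    while (ADCSRA & (1 << ADSC))                /* wait for it to finish        */
        ;
    return ADC;
}

static void uart_send(uint8_t byte)
{
    while (!(UCSR0A & (1 << UDRE0)))            /* wait for empty TX buffer */
        ;
    UDR0 = byte;
}

int main(void)
{
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);  /* ADC on, /64 prescale */
    UBRR0  = 103;                                         /* 9600 baud @ 16 MHz   */
    UCSR0B = (1 << TXEN0);
    DDRB  &= ~(1 << PB0);                                 /* finger contact input  */
    PORTB |=  (1 << PB0);                                 /* pull-up: closed = low */

    for (;;) {
        uint8_t x    = adc_read(0) >> 2;        /* accelerometer X axis   */
        uint8_t y    = adc_read(1) >> 2;        /* accelerometer Y axis   */
        uint8_t grip = !(PINB & (1 << PB0));    /* thumb touching finger? */

        /* 4-byte packet: sync, X tilt, Y tilt, grip flag. */
        uart_send(0xAA);
        uart_send(x);
        uart_send(y);
        uart_send(grip);

        _delay_ms(20);                          /* ~50 updates per second */
    }
}
```

On the PC side, MATLAB could read these packets from the virtual serial port, map the tilt values to cursor movement, and treat the grip flag as a pick-up/put-down command.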

If you’re more into building a mechanized game check out this pair of telepresence chess boards.

Continue reading “Virtual chess uses glove controllers”

Voice controlled video game uses “Biu” and “ahh” for control


This video game gives your thumbs a rest while stretching those vocal cords. The pair of microphones seen above control the video game on the LCD. Saying “Biu” launches a projectile while “ahh” adjusts its flight path. The system was developed by [Tian Gao] as a final project for his ECE 4760 course at Cornell University.

The inputs are common computer microphones connected to some processing circuitry he built on a piece of protoboard. It consists of some RC filtering and an LM358 op-amp to get the signals ready for the ATmega1284. There is only one ADC on that chip, so [Tian] alternates sampling between the microphones using the analog multiplexer built into the chip. The video output is an NTSC composite signal; to keep a reasonable frame rate he uses graphics packed in multiples of 8 bits, all of which allows him to create a 160×200 pixel display.
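The single-ADC juggling is easy to picture in code. Below is a minimal sketch, again assuming avr-libc, of switching the ADC input multiplexer between two microphone channels each game tick; the channel numbers and the loudness threshold are assumptions rather than values from [Tian]’s build.

```c
#include <avr/io.h>
#include <stdint.h>

#define MIC_FIRE  0   /* "Biu" microphone on ADC0 (assumed) */
#define MIC_STEER 1   /* "ahh" microphone on ADC1 (assumed) */

static void adc_init(void)
{
    ADCSRA = (1 << ADEN) | (1 << ADPS2) | (1 << ADPS1);  /* enable ADC, /64 prescale */
}

static uint16_t adc_sample(uint8_t channel)
{
    ADMUX  = (1 << REFS0) | (channel & 0x1F);  /* AVcc ref, select channel */
    ADCSRA |= (1 << ADSC);                     /* start a conversion       */
    while (ADCSRA & (1 << ADSC))
        ;
    return ADC;
}

/* Called once per game tick: take one sample from each mic and return
 * simple loudness flags the game loop can act on. */
static void poll_mics(uint8_t *fire, uint8_t *steer)
{
    uint16_t a = adc_sample(MIC_FIRE);
    uint16_t b = adc_sample(MIC_STEER);

    /* The op-amp stage biases the signal to mid-rail, so distance from 512
     * approximates loudness on the 10-bit ADC. */
    *fire  = (a > 512 ? a - 512 : 512 - a) > 80;
    *steer = (b > 512 ? b - 512 : 512 - b) > 80;
}
```

A channel selection written to ADMUX only takes effect on the next conversion, which is why each call selects the channel first and then starts a fresh conversion.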

All of this makes the game sound a little dry, but we dare you to listen to the video clip after the break without cracking a smile.

Continue reading “Voice controlled video game uses “Biu” and “ahh” for control”

Cornell ECE 4760 lecture videos now online

Whenever we hear about ECE 4760 we take notice. That’s because a ton of fantastic hacked-together projects have resulted from the class. It’s offered at Cornell University and focuses on designing projects based on microcontrollers; we look at it as a ‘how to connect everything to your microcontroller’ guide. The good news for you is that 34 lecture videos from the Spring 2012 ECE 4760 class are now available to watch for free online. Coupled with the course webpage itself (which outlines the reading, labs, and homework), this turns into an opportunity to work through the entire course on your own schedule.

If you need a brief preview, here are a few random things we’ve seen as final projects from the course: a digital saxophone, a handwriting decoder, and a haptic feedback unit for building your biceps.

We’re still working our way through the Nand2Tetris project, but we’re putting these lectures on our watch list for later.

[via Reddit]

Lazy Labor Day educational time. Watch Cornell’s microcontroller courses.

 

C’mon, you know you’re not really going to do much today. You might as well spend that time learning some skills instead of watching funny cats. The Cornell ECE lectures on microcontrollers (ECE 4760 and ECE 5760), taught by [Bruce Land], are available online for free.

Not only do you get to enjoy these two courses, but there are videos available showing off several different categories of student projects as well.

Continue reading “Lazy Labor Day educational time. Watch Cornell’s microcontroller courses.”

KMODDL: A mechanism maker’s dream site.

Computers are relatively new still, but we’ve had mechanics for a very long time. KMODDL keeps us from reinventing the wheel. It contains collections of mechanisms with descriptions, pictures, and even videos. We were working on an arbalest design not too long ago and were having trouble coming up with a clever ratchet design for one of the parts. We spent a few moments in KMODDL looking through the ratchet section of the Reuleaux collection, and soon after we had the basic building blocks of our design. Sure, there are books you could buy that do a similar thing, but KMODDL is completely free, very in-depth, and easier to search. Plus, with a useful tool like this you might not even have to take apart all your appliances anymore to see how they work. My first sewing machine might have lived a longer life had I seen this first. Anyone know of more resources like this?

An Autonomous Car Using a “Webcam”

This autonomous remote control-style car from Cornell students was designed for a senior-level engineering course there. Its main “sensor” is a low-res webcam-style camera. As shown in the video after the break, the car does quite well staying within two black lines on a white surface using its vision processing. It also has an IR sensor to detect objects in front of the car and stop before crashing.
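To get a sense of how little math basic line following needs, here’s a minimal sketch in plain C: threshold one scan row, find the inner edges of the two black lines, and steer toward the lane center. The row width, darkness threshold, and steering convention are all assumptions; the students’ report documents what the car actually does.

```c
#include <stdint.h>

#define ROW_W     128   /* pixels in one horizontal scan line (assumed)    */
#define BLACK_MAX 60    /* grayscale values below this count as "line"     */

/* Given one row of grayscale pixels, return a signed steering command:
 * negative = steer left, positive = steer right, 0 = hold course. */
int16_t steer_from_row(const uint8_t row[ROW_W])
{
    int16_t left = -1, right = -1;

    /* Inner edge of the left line: rightmost dark pixel in the left half. */
    for (int16_t i = 0; i < ROW_W / 2; i++)
        if (row[i] < BLACK_MAX)
            left = i;

    /* Inner edge of the right line: leftmost dark pixel in the right half. */
    for (int16_t i = ROW_W / 2; i < ROW_W; i++)
        if (row[i] < BLACK_MAX) {
            right = i;
            break;
        }

    if (left < 0 || right < 0)
        return 0;                    /* a line is missing: hold course */

    int16_t lane_center = (left + right) / 2;
    return lane_center - ROW_W / 2;  /* offset from image center drives steering */
}
```

An outer loop would feed this offset to the steering servo and let the IR sensor override everything when an obstacle shows up ahead.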

All “vision” computations are handled by an Atmel Mega644, an 8-bit microcontroller. Because of the processing limits of that chip, much work went into making the image processing computationally efficient. The students give an incredibly detailed account of their project, focusing on the code and the electrical design. Check out the video of their car in action after the break. Continue reading “An Autonomous Car Using a “Webcam””