[Jeremy Blum] aimed to be the brightest student at his Master’s graduation ceremony this spring. He designed an LED rig for his mortar board which should battle the sun’s intensity by using up to 21 watts of power. But he didn’t stop with eye-catching intensity. While he was at it, he also included some interactive features so the guy behind him has a way to keep from going blind.
One thing that really caught our eye is the 3D printed parts he generated for the project. There’s a nice mounting plate for the LED side of things, and a wrist-mounted enclosure for the Raspberry Pi board. Wait, why does he need an RPi to drive some LEDs? We already mentioned the interactivity, which is facilitated by the Pi acting as a WiFi hotspot. Connect to the access point and choose a color. If you’re in the seat behind [Jeremy] you’ll want to choose black! All of this is explained in his video presentation.
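Curious how a color picked on a phone ends up driving the LEDs? [Jeremy]’s own code isn’t shown here, but the conversion from a web-form hex color to PWM duty cycles might look something like this minimal sketch (the #RRGGBB format and 8-bit channels are our assumptions, not details from the project):

```python
def hex_to_duty(color: str) -> tuple:
    """Convert a '#RRGGBB' hex string from the web UI into
    8-bit PWM duty cycles for the red, green, and blue channels."""
    color = color.lstrip("#")
    if len(color) != 6:
        raise ValueError("expected RRGGBB")
    return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

# Choosing black turns the rig off entirely:
print(hex_to_duty("#000000"))  # (0, 0, 0)
print(hex_to_duty("#FF8800"))  # (255, 136, 0)
```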
Continue reading “LED Mortar Board Battles Sun’s Brightness With 21W Of Power”
This isn’t an FPGA emulating Mario Bros., it’s an FPGA playing the game by analyzing the video and sending controller commands. It’s a final project for an engineering course: the ECE5760 Advanced FPGA course over at Cornell University, which provides entertainment for us every time the final projects are due.
Developed by team members [Jeremy Blum], [Jason Wright], and [Sima Mitra], the video parsing is a hack. To get things working they converted the NES’s 240p video signal to VGA. This resulted in the rolling frame shown in the demo video. It also messes with the aspect ratio and causes a few other headaches, but the FPGA still manages to interpret the image correctly.
Look closely at the screen capture above and you’ll see some stuff that shouldn’t be there. The team developed a set of tests used to determine obstacles in Mario’s way. The red lines signify blocks he will have to jump over. This also works for pits that he needs to avoid, and a different set of tests detects moving enemies. Once it knows what to do, the FPGA emulates the necessary controller signals, pushing them to the vintage gaming console to see Mario safely to the end of the first level.
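The team’s tests run in hardware on the FPGA, but the idea behind a ground-level pixel scan translates to a few lines of software. Here’s a rough analogue of that kind of test; the function name, frame layout, and parameters are ours, not the team’s:

```python
def obstacle_ahead(frame, mario_col, ground_row, block_color, look_ahead=16):
    """Scan the columns just ahead of Mario at ground level and report
    whether any pixel matches the block color -- a software analogue of
    the FPGA's pixel tests (all names and parameters are illustrative)."""
    for col in range(mario_col + 1, mario_col + 1 + look_ahead):
        if frame[ground_row][col] == block_color:
            return True   # something to jump over
    return False

# Toy one-row "frame": 0 = sky, 1 = block at column 10
frame = [[0] * 10 + [1] + [0] * 9]
print(obstacle_ahead(frame, mario_col=5, ground_row=0, block_color=1, look_ahead=8))
```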
We think this is more hardcore than some other autonomous Mario-playing hacks simply because it patches into the original console hardware instead of using an emulator.
Continue reading “FPGA Plays Mario Like A Champ”
[Jeremy Blum] wrote in to share his LibeTech QR Code Door Lock project. He developed it during his Senior year at Cornell University along with three of his classmates. It seeks to move away from magnetic card locks in favor of optical locks that authenticate based on a QR code.
The hardware he’s using here is definitely cost-prohibitive, but we’re sure the concept could be greatly simplified. In this case a BeagleBone running embedded Linux monitors a feed from a webcam. When it detects a QR code, it compares it with a database of approved keys and will unlock the door for you.
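The key comparison itself is the easy part. Here’s a hedged sketch of what such a check might look like, using a constant-time comparison so an attacker can’t learn anything from response timing (the key format and the in-memory “database” are made up for illustration):

```python
import hmac

# Approved keys would live in a small database on the BeagleBone;
# a set of strings stands in for it here (purely illustrative).
APPROVED = {"room-214:guest:2012-05-18", "room-214:staff"}

def is_authorized(qr_payload):
    """Compare a decoded QR payload against every approved key using a
    constant-time comparison, so timing doesn't leak key contents."""
    return any(hmac.compare_digest(qr_payload, key) for key in APPROVED)

print(is_authorized("room-214:staff"))     # True
print(is_authorized("room-999:intruder"))  # False
```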
There are problems with this technique, one being that an attacker might be able to get a usable photograph of your key without you knowing. But the majority of hotel locks in use right now are even less secure than that. On the upside, the key to your room can be emailed to you for use on just about any device with a screen, or printed out on a piece of paper.
You can find [Jeremy’s] presentation video embedded after the break.
Continue reading “QR Code Opens Doors To You”
[Jeremy Blum] and [Jason Wright] pose with their project at the end of a 24 hour hackathon. The Facebook headquarters in New York City held the event as part of their Summer of Hack program. As an homage to the hosts, the hacking duo decided to create a physical book and populate it with the virtual Facebook. And what do you call such a creation? The Face(book)^2.
The video after the break gives the best overview of the hardware, but here’s the gist of it: they started with the largest hardcover book they could find, hollowing out its pages to house their own hardware. When you open the book it calls back to a computer over an Xbee link with a request for data. The Python script on the computer pulls the newest posts from a Facebook feed, sending them back to the book to be displayed. There is a graphic LCD and four character LCDs built in for this purpose. There’s also an accelerometer which is used for detecting page turns when the cover is jostled. The rest of the interactivity is provided by a few tactile switches mounted next to the smaller LCD screens for navigation and the ‘like’ feature.
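A character LCD only gives you a handful of fixed-width rows, so any Facebook post has to be wrapped and padded before it heads out over the Xbee link. Here’s a rough sketch of that step (the 20x4 geometry is our assumption, not a measured spec of the book’s displays):

```python
import textwrap

def frame_post(post, width=20, rows=4):
    """Wrap a status update into fixed-width lines and pad out to the
    LCD's row count (a 20x4 character LCD is an assumption)."""
    lines = textwrap.wrap(post, width)[:rows]
    lines += [""] * (rows - len(lines))          # blank-fill unused rows
    return [line.ljust(width) for line in lines] # pad each row to width

for row in frame_post("Just wrapped a 24 hour hackathon at Facebook NYC!"):
    print(repr(row))
```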
Continue reading “Hackathon Results In The Facebook Book”
[Jason Wright] and [Jeremy Blum] are showing off the project they developed for their Designing with Microcontrollers course at Cornell University. They call it the Heliowatcher, and if you know your Greek mythology we’d bet you’ve figured out that this watches the movement of the sun and adjusts a solar panel to follow it.
Their design is simple and effective. The base is mounted like a Lazy Susan, able to pivot on the horizontal plane. The bottom edge of the solar panel is mounted with two door hinges, with a motorized screw jack used to raise and lower it. The system uses a GPS to provide geographical position, day, and time feedback. This is used in conjunction with an array of four LEDs to determine the best position of the panel. Those LEDs are acting as light sensors; when the top and the bottom detect similar levels, the panel is at its most efficient orientation. The left and right LED sensors work the same way.
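The balancing act between the top and bottom LED sensors boils down to a simple comparison with a dead band so the screw jack doesn’t hunt back and forth. A software sketch of that decision might look like this (the readings and threshold are illustrative ADC counts, not values from the Heliowatcher):

```python
def tilt_direction(top, bottom, deadband=10):
    """Decide which way to drive the screw jack from the two LED light
    readings: +1 raises the panel, -1 lowers it, 0 holds position.
    The deadband keeps the jack from hunting when readings are close."""
    if top - bottom > deadband:
        return 1    # more light up top: raise the panel
    if bottom - top > deadband:
        return -1   # more light below: lower the panel
    return 0        # balanced: most efficient orientation

print(tilt_direction(620, 480))  # 1
print(tilt_direction(500, 505))  # 0
```

The same logic, rotated ninety degrees, handles the left and right sensor pair on the Lazy-Susan axis.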
Now if we can just work out a self-cleaning system to keep the panels free of the dirty film that builds up over time we’d be set!
Continue reading “Heliowatcher Positions Solar Panels For Highest Efficiency”
[Easton] has been working with [Jeremy Blum] to come up with the newest version of his animatronic hand. You may remember seeing [Easton’s] first animatronic hand, with which he won his regional science fair and made a trip to nationals. Since then he’s been working on improvements, and with access to [Jeremy’s] Makerbot he harnessed the power of open source design to make his own printed hand, extending a different Thingiverse project.
He’s still using the original sensor glove as a controller. It sends commands to the Arduino controlling the arm via an Xbee module. From there, five servos inside a fiberglass forearm move each finger and the thumb. The video clip after the break gives [Easton] a chance to show off all of the new design features, and finishes with a demonstration of the hand grasping different objects. We had a chance to chat with him briefly. He’s got big goals for himself, aiming to design a prosthetic arm for under $1000. That’s not a career goal… he’d like to get it done this year.
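Under the hood, each flex-sensor reading from the glove has to be mapped onto a servo angle before a finger can curl. Here’s a sketch of that mapping, written in Python for clarity rather than as Arduino code; the ADC range is a guess, not a measurement from [Easton]’s glove:

```python
def flex_to_servo(reading, flex_min=300, flex_max=700):
    """Map a flex-sensor ADC reading from the glove onto a 0-180 degree
    servo angle, clamping out-of-range readings (the sensor range here
    is an assumption, not measured from the actual glove)."""
    reading = max(flex_min, min(flex_max, reading))
    span = flex_max - flex_min
    return round((reading - flex_min) * 180 / span)

print(flex_to_servo(300))  # 0   (finger straight)
print(flex_to_servo(700))  # 180 (finger fully curled)
print(flex_to_servo(500))  # 90
```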
Continue reading “[Easton’s] Animatronic Hand Gets 3D Printed Upgrade”
[Jeremy Blum] recently finished writing a couple of software packages for his SudoGlove system that turn it into a music controller with a lot of features. We’ve seen the hardware in a previous post, and as a goal for this iteration he decided not to alter the hardware or the firmware controlling it whatsoever, making this a PC-side, software-only hack. It’s nice to see improvement on the original ideas, as we feel most of the glove-based projects we’ve covered end up getting thrown in the junk box after the developer’s interest wanes.
After the break you can see and hear a demonstration of the complete system. The front end of the application shown was written using Processing and includes a slew of user configurations for each sensor on the glove itself. Under the hood, [Jeremy] built on the PureData framework in order to really unlock the potential for translating physical movement into synthesized sound. There is also a visual feedback application which will help you practice your movements, important if you’re giving live performances where each finger is a different instrument. Everything for this project, both hardware and software, has been released under a CC license, so check out [Jeremy’s] site if you’re interested in building on part or all of the good work he’s done.
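A typical job for a patch like this is scaling a raw sensor reading onto a musical range. Here’s a sketch of that kind of mapping, in Python rather than PureData for readability (the 10-bit input and two-octave MIDI range are our assumptions, not details of the SudoGlove patches):

```python
def sensor_to_midi(value, in_min=0, in_max=1023, note_min=48, note_max=72):
    """Scale a 10-bit glove-sensor reading onto a two-octave MIDI note
    range (C3 to C5) -- the kind of mapping a synth patch performs.
    All ranges here are illustrative."""
    value = max(in_min, min(in_max, value))
    span = in_max - in_min
    return note_min + round((value - in_min) * (note_max - note_min) / span)

print(sensor_to_midi(0))     # 48 (C3)
print(sensor_to_midi(1023))  # 72 (C5)
print(sensor_to_midi(512))   # 60 (middle C)
```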
Update: [Jeremy] wrote in with a bit of a correction for our synopsis. The application shown in the video is written entirely in PureData and the visual debugger was written with Processing. The two are standalone packages that don’t depend on each other. He also sent us a link to download the code packages.
Continue reading “SudoGlove Gets A Big Software Upgrade”