Computers Playing Flappy Bird. Skynet Imminent. Humans Flapping Arms.

After viral popularity, a developer rage quit, and some crazy eBay auctions, the world at large is just about done with Flappy Bird. Here at Hackaday, we can’t let it go without showcasing two more hacks. The first is the one we’ve all been waiting for: a robot that will play the damn game for us. Your eyes don’t deceive you in that title image; the Flappy Bird bot is up to 147 points and going strong. [Shi Xuekun] and [Liu Yang], two hackers from China, have taken full responsibility for this hack. They used OpenCV with a webcam on Ubuntu to determine the positions of both the bird and the pipes. Once the positions are known, the computer calculates the next move. When it’s time to flap, a signal is sent to an Arduino Mega 2560.

The genius of this hack is the actuator. Most servos or motors would have been too slow for this application, so [Shi] and [Liu] used the Arduino and a motor driver to activate a hard drive voice coil. The voice coil was fast enough to touch the screen at exactly the right time, but not so powerful as to smash their tablet.
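We haven’t seen [Shi] and [Liu]’s source, but the whole see-decide-flap loop fits in a surprisingly small Processing sketch. The version below is only a sketch of the idea: it swaps OpenCV for a naive color scan, and the bird color, threshold, serial command byte, and gap-finding placeholder are all our own assumptions.

```processing
import processing.video.*;
import processing.serial.*;

Capture cam;
Serial arduino;
color birdColor = color(250, 220, 60);   // assumed: the bird's yellow body

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  arduino = new Serial(this, Serial.list()[0], 115200);
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  image(cam, 0, 0);
  cam.loadPixels();

  // Find the bird by averaging the coordinates of bird-colored pixels.
  float sumX = 0, sumY = 0;
  int hits = 0;
  for (int y = 0; y < cam.height; y++) {
    for (int x = 0; x < cam.width; x++) {
      color c = cam.pixels[y * cam.width + x];
      if (dist(red(c), green(c), blue(c),
               red(birdColor), green(birdColor), blue(birdColor)) < 40) {
        sumX += x;
        sumY += y;
        hits++;
      }
    }
  }
  if (hits < 10) return;                 // bird not visible this frame

  float birdY = sumY / hits;
  float gapY = findNextGapCenter();
  // Flap whenever the bird sinks below the center of the approaching gap.
  if (birdY > gapY) arduino.write('F');  // Arduino fires the voice coil
}

float findNextGapCenter() {
  // Placeholder: a real bot would locate the green pipe columns and return
  // the y-center of the opening between them.
  return height * 0.5;
}
```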

If you would like to make flapping a bit more of a physical affair, [Jérémie] created Flappy Bird with Kinect. He wrote a quick Processing sketch which uses the Microsoft Kinect to look for humans flapping their arms. If flapping is detected, a command is sent to an Android tablet. [Jérémie] initially wanted to use the Android Debug Bridge (ADB) to send the touch commands, but found it too laggy for this sort of hardcore gaming. The workaround is to use a serial-connected Arduino as a mouse: the Processing sketch sends a ‘#’ to the Arduino over serial, the Arduino sends a mouse click to the computer, which is running hidclient, and hidclient finally sends a Bluetooth mouse click to the tablet. Admittedly, this is a bit of a Rube Goldberg approach, but it does add an Arduino to a Flappy Bird hack, which we think is a perfect pairing.
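For the curious, the Processing end of that chain can be tiny. Here’s a minimal sketch of the idea; the flap threshold, the cooldown, and the hand-tracking stand-in are all assumptions, since [Jérémie]’s actual sketch reads hand positions from the Kinect:

```processing
import processing.serial.*;

Serial arduino;
float lastHandY = 0;
int lastFlap = 0;

void setup() {
  size(200, 200);
  arduino = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  float handY = trackHandY();
  // A fast downward swing counts as one flap; the cooldown keeps one long
  // swing from registering several times. Both numbers are guesses.
  if (handY - lastHandY > 30 && millis() - lastFlap > 300) {
    arduino.write('#');   // the Arduino turns this into a mouse click
    lastFlap = millis();
  }
  lastHandY = handY;
}

float trackHandY() {
  // Stand-in for the Kinect skeleton lookup, so the sketch runs without one.
  return mouseY;
}
```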

An Android Controlled Arduino Drone

Who among us has not wanted to create their own drone? [Stefan] wrote in to tell us about a project for high school students, where a Styrofoam glider (translated) is converted into an Android (or PC) controlled drone.

[Stefan] tells us that the inspiration for this project comes from 100 years ago, when “steam-engines were THE thing” and children were introduced to modern technology with toy engines. “Today, mechatronic designs are all around us and this is an attempt to build the equivalent of the toy steam engine.” This project showcases how modern tools make it easy for kids to get involved and excited about hardware hacking, electronics, and software.

At the heart of the glider is an Arduino Pro Mini which communicates with either a computer or an Android phone via Bluetooth. It is especially interesting to note that the students used Processing to create the Android app, rather than complicating things with Eclipse and the Android Development Tools (ADT). While the more detailed PDF documentation at the end of the project page is in German, all of the Processing and Arduino code needed to build the project is provided. It would be awesome to see more Bluetooth-related projects include a simple Android application; after all, many of us carry computers in our pockets these days, so we might as well put them to good use!
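Since the glider’s Bluetooth module shows up as an ordinary serial port on the host, the PC side of the link can be very small. Here’s a rough Processing sketch of the idea; the single-byte command protocol is our own invention, not the students’ actual one:

```processing
import processing.serial.*;

Serial glider;

void setup() {
  size(200, 200);
  // Assumes the Bluetooth module enumerates as the first serial port.
  glider = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  background(0);  // nothing to show; all the action is in keyPressed()
}

void keyPressed() {
  if (key != CODED) return;
  // One command byte per input; the Arduino Pro Mini on the glider would
  // decode these and drive the control surfaces accordingly.
  switch (keyCode) {
    case UP:    glider.write('U'); break;  // pitch up
    case DOWN:  glider.write('D'); break;  // pitch down
    case LEFT:  glider.write('L'); break;  // steer left
    case RIGHT: glider.write('R'); break;  // steer right
  }
}
```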

Do you have any well documented projects that introduce young and budding engineers to hardware or software hacking? Let us know in the comment section or send us a tip!

Perfect Jump Shots with OpenCV and Processing

[ElectricSlim] likes taking “Jump Shots” – photographs where the subject is captured in midair. He’s created a novel method to catch the perfect moment with OpenCV and Processing. Anyone who has tried jump shot photography can tell you how frustrating it is. Even with an experienced photographer at the shutter, shots are as likely to miss that perfect moment as they are to catch it. This is even harder when you’re trying to take jump shots solo. Wireless shutter releases can work, but unless you have a DSLR, shutter lag can cause you to miss the mark.

[ElectricSlim] decided to put his programming skills to work on the problem. He wrote a Processing sketch using the OpenCV library. The sketch has a relatively simple logic path: “IF a face is detected within a bounding box AND the face is dropping in height THEN snap a picture.” The system isn’t perfect: a person must be looking directly at the camera for the face to be detected. However, it’s good enough to take some great shots. The software is also repeatable enough to make animations of various jump shots, as seen in [ElectricSlim’s] video.
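That logic translates almost line-for-line into Greg Borenstein’s OpenCV for Processing library. Here’s a minimal reconstruction; the bounding box coordinates, drop threshold, and output filename are our guesses, and [ElectricSlim]’s actual sketch may differ:

```processing
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;
float lastFaceY = -1;

// The "bounding box" from the logic above; the exact region is a guess.
int boxX = 120, boxY = 40, boxW = 400, boxH = 400;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  opencv = new OpenCV(this, width, height);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  cam.start();
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  image(cam, 0, 0);
  opencv.loadImage(cam);

  Rectangle[] faces = opencv.detect();
  if (faces.length == 0) {
    lastFaceY = -1;  // lost the face; don't compare across the gap
    return;
  }

  Rectangle f = faces[0];
  boolean inBox = f.x > boxX && f.y > boxY
               && f.x + f.width < boxX + boxW
               && f.y + f.height < boxY + boxH;

  // Face in the box AND dropping (y grows downward on screen): the jumper
  // is past the apex of the jump, so grab the frame now.
  if (inBox && lastFaceY >= 0 && f.y > lastFaceY + 2) {
    saveFrame("jumpshot-####.png");
  }
  lastFaceY = f.y;
}
```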

We think this would be a great starting point for a trigger system. Use a webcam to determine when to shoot a picture. When the conditions pass, a trigger could be sent to a DSLR, resulting in a much higher quality frame than what most webcams can produce.

IMU boards as next-gen motion capture suit?

This guy takes a drink and so does the virtual wooden mannequin. Well, its arm takes a drink because that’s all the researchers implemented during this summer project. But the demo really makes us think that suits full of IMU boards are the next generation of motion capture. Not because this is the first time we’ve seen it (the idea has been floating around for a couple of years) but because the sensor chips have gained incredible precision while dropping to bargain basement prices. We can pretty much thank the smartphone industry for that, right?

Check out the test subject’s wrist. That’s an elastic bandage which holds the board in place. There’s another one on his upper arm that is obscured by his shirt sleeve. The two of these are enough to provide accurate position feedback to make the virtual model move. In this case the sensor data is streamed to a computer over Bluetooth, where a Processing script maps it to the virtual model. But we’ve seen similar 9-axis sensors in projects like this BeagleBone sensor cape. It makes us think it would be easy to have an embedded system like that on the back of a suit, collecting data from sensor boards all over the test subject’s body.
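As a sketch of what the mapping step might look like: a couple of orientation angles come in over the Bluetooth serial link and drive a two-segment stick arm. The line format and joint layout here are assumptions; the real project tracks full 3D orientation.

```processing
import processing.serial.*;

Serial imu;
float upperPitch = 0, forearmPitch = 0;

void setup() {
  size(400, 400, P3D);
  stroke(255);
  strokeWeight(8);
  imu = new Serial(this, Serial.list()[0], 115200);
  imu.bufferUntil('\n');
}

// Assumed line format from the boards: "upperArmDegrees,forearmDegrees\n"
void serialEvent(Serial s) {
  String[] parts = split(trim(s.readString()), ',');
  if (parts.length == 2) {
    upperPitch = radians(float(parts[0]));
    forearmPitch = radians(float(parts[1]));
  }
}

void draw() {
  background(30);
  translate(width / 2, height / 3);
  rotateZ(upperPitch);    // shoulder joint, driven by the upper-arm board
  line(0, 0, 0, 100);     // upper arm segment
  translate(0, 100);
  rotateZ(forearmPitch);  // elbow joint, driven by the wrist board
  line(0, 0, 0, 80);      // forearm segment
}
```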

Oh, who are we kidding? [James Cameron]’s probably already been using this for years.

Hackaday Links: Sunday, July 28th, 2013

[Chris Gammell] tipped us off that he’s building an online training program for learning electronics. The ten-session course will cost money to take, but you can get the goods for free if you’re one of the beta testers. We love listening to The Amp Hour podcast, which is just one of [Chris’] many endeavors.

Did you buy a Chromecast this week? We did, but we don’t have it in hand yet (ordered through Amazon). You can still get a look inside from the iFixit teardown.

Practice your Processing skills by using it to code a game of Pong.

A bit of lighter fluid and a hacked insert will get you a flaming wallet. We guess this is a different type of anti-pickpocket device. [Thanks, Stephen]

[Brain] used a $1.50 magnifying lens to help his Raspberry Pi camera module read QR codes better.

We really like [Aaron Christophel’s] LED matrix clock (translated). He started from a marquee that must be at least a decade old. He stripped it down and figured out how to drive it using a Sanguino as a controller.

Modeling squid cells in code foregoes connecting voltage to animals

[Kemper Smith] built a little piece of nature in Processing. He was inspired by a biology experiment that excited squid cells using electricity. The result is an interactive display that mimics that biology.

Last August we saw a peculiar experiment that forced Cypress Hill music on the color-changing cells of a squid. The cells make colors by stretching sacs of pigment; the larger they get, the more of that color is shown. Normally this is used for camouflage. The image on the left is the reaction from connecting headphone wires while music is being played.

But we can’t all get our hands on this type of wet-ware — especially if you live far inland. So [Kemper] got to work writing some Processing code. The result is seen on the right. It does a good job of replicating the motion and color palette of the original. He’s put together a web-based demonstration which you can interact with using your mouse cursor. We also saw him demonstrate a Kinect-based version at our local hackerspace.
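The chromatophore model itself is a natural fit for Processing. Here’s a loose sketch of the stretch-to-show-pigment idea, not [Kemper]’s code; the cell count, palette, and excitation radius are all made up:

```processing
int n = 200;
float[] x = new float[n], y = new float[n], sacSize = new float[n];
color[] pigment = new color[n];

void setup() {
  size(640, 480);
  noStroke();
  for (int i = 0; i < n; i++) {
    x[i] = random(width);
    y[i] = random(height);
    sacSize[i] = 4;
    // Rough squid palette: browns, reds, and yellows.
    pigment[i] = color(random(150, 255), random(40, 120), random(20, 60));
  }
}

void draw() {
  background(230);
  for (int i = 0; i < n; i++) {
    // Cells near the cursor get "excited", like cells near the electrodes.
    float excite = max(0, 120 - dist(mouseX, mouseY, x[i], y[i]));
    // Stretching the sac exposes more pigment; lerp gives the slow relax.
    sacSize[i] = lerp(sacSize[i], 4 + excite * 0.3, 0.1);
    fill(pigment[i]);
    ellipse(x[i], y[i], sacSize[i], sacSize[i]);
  }
}
```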
