Hand Gestures Play Tetris

There are reports of a Tetris movie with a sizable budget, and with it comes a flood of questions about how that would work. Who would the characters be? What kind of lines would there be to clear? Whatever the answers, we can all still play the classic game in the meantime. And, thanks to some of the engineering students at Cornell, we can play it without using a controller.

This hack comes from [Bruce Land]’s FPGA design course. The group’s game uses a video camera that outputs a standard NTSC signal; one FPGA filters that video to detect the user. From there, the user moves their hands to different regions of the screen to control the movement of the Tetris pieces. This information is sent over GPIO to a second FPGA, which uses it to play the game itself.
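
The write-up doesn’t give the exact region boundaries, and the real implementation lives in FPGA logic rather than software, but the mapping is easy to sketch. Here’s a toy Python version of the idea, assuming the skin-detection filter already yields a hand centroid (the boundaries and names here are our own guesses, not the students’ design):

```python
FRAME_WIDTH = 640  # active video width in pixels; illustrative value

def region_to_command(hand_x):
    """Map the hand centroid's horizontal position to a Tetris action.

    The screen is split into three vertical regions: left third moves
    the piece left, right third moves it right, middle rotates it.
    """
    third = FRAME_WIDTH // 3
    if hand_x < third:
        return "MOVE_LEFT"
    if hand_x > 2 * third:
        return "MOVE_RIGHT"
    return "ROTATE"
```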

This game is done entirely in hardware, which makes it rather unusual. All of the game dynamics, including block generation, movement, and boundary conditions, are implemented in hardware, and all of the skin recognition is done in hardware as well. Be sure to check out the video of the students playing the game, and if you’re really into hand gesture-driven fun, you aren’t limited to Tetris: you can also drive a car.

Continue reading “Hand Gestures Play Tetris”

Hand Gestures Drive Car

There are a number of ways to control an automobile without using the pedals, and sometimes even without using the steering wheel. Most commonly these alternative control mechanisms are installed in vehicles whose owners are disabled in some way, but [Anurag] has taken this idea of alternative control one step further. He has built a car that can be driven by hand gestures alone.

A Raspberry Pi 2 installed on a remote-controlled car handles processing and communication. The Pi creates a wireless network, and a laptop connects to it. The laptop’s webcam captures frames at 15 fps and checks each one for the driver’s hand gestures. Each frame is converted to grayscale and thresholded, its contours are extracted, and the centroid and farthest point of the hand are computed.
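
The exact values live in the project code, but the pipeline described above maps almost line-for-line onto OpenCV. A rough Python sketch of one frame’s processing, with the threshold and all names being illustrative guesses rather than [Anurag]’s actual choices:

```python
import cv2
import numpy as np

def process_frame(frame):
    """Grayscale, threshold, find the hand contour, and return
    its centroid and farthest point (or None if no hand is seen)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)

    # Assume the largest contour in the mask is the hand (OpenCV 4 signature)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)

    # Centroid from image moments
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

    # Farthest contour point from the centroid (roughly, a fingertip)
    pts = hand.reshape(-1, 2)
    far = pts[np.argmax(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy))]
    return (cx, cy), tuple(far)
```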

After some calculations, a movement decision is made. The decision is passed to the Pi, which in turn passes it along to the car’s internal control chip. All of the code is available on the project’s GitHub page. [Anurag] hopes that this can be scaled up to full-sized cars in the future. We’ve seen gesture-based remote controls before that rely on sonar sensors, so it’s interesting to see one that relies strictly on image processing.
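
The real code is on the GitHub page; as a sketch of the laptop-to-Pi leg, here’s what sending a decision over the Pi’s access point could look like with a plain TCP socket (the address and port are made up for illustration):

```python
import socket

PI_ADDR = ("192.168.4.1", 5005)  # hypothetical address/port on the Pi's AP

def send_decision(decision):
    """Ship a movement decision (e.g. 'FORWARD', 'LEFT') to the Pi,
    which then drives the car's internal control chip."""
    with socket.create_connection(PI_ADDR, timeout=1.0) as sock:
        sock.sendall(decision.encode("ascii") + b"\n")
```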

Continue reading “Hand Gestures Drive Car”

Impedance Tomography is the new X-Ray Machine

Seeing what’s going on inside a human body is pretty difficult. Unless you’re Superman and have X-ray vision, you’ll need a large, expensive piece of medical equipment. And even then, X-rays sit in a harmful part of the electromagnetic spectrum. Rather than using a large machine or questionable Kryptonian ionizing radiation vision, there’s another option now: electrical impedance tomography.

[Chris Harrison] and the rest of a research team at Carnegie Mellon University have come up with a way to use electrical excitation to view internal impedance cross-sections of an arm. While this doesn’t have the resolution of an X-ray or CT scan, a large amount of information can still be gathered with this method. Different structures in the body, like bone, have a different impedance than muscle or other tissues. Even a flexed muscle changes its impedance from its resting state, and the team has used their sensor as a proof of concept for hand gesture recognition.
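
The post doesn’t describe the classifier, but once each measurement is a vector of impedances from the electrode pairs around the arm, gesture recognition can be sketched as simple template matching. A toy Python illustration of that idea, not the CMU team’s actual method:

```python
import numpy as np

def classify_gesture(measurement, templates):
    """Nearest-neighbor match of an impedance vector against stored,
    labeled example vectors ('templates'), one per known gesture."""
    measurement = np.asarray(measurement, dtype=float)
    best_name, best_dist = None, np.inf
    for name, template in templates.items():
        dist = np.linalg.norm(measurement - np.asarray(template, dtype=float))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name
```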

This device is small, low-power, and low-cost, and we could easily see it being the “next thing” in smart watch features. Gesture recognition at this level would open up a whole world of possibilities, especially if you don’t have to rely on any non-wearable hardware like ultrasound or LIDAR.

BitDrones are Awesome, Ridiculous at Same Time

At first we thought it was awesome, then we thought it was ridiculous, and now we’re pretty much settled on “ridiculawesome”.

BitDrones is a prototype of a human-computer interface that uses tiny quadcopters as pixels in a 3D immersive display. That’s the super-cool part. “PixelDrones” have an LED on top. “ShapeDrones” have a gauzy cage that gets illuminated by color LEDs, making them into life-size color voxels. (Cool!) Finally, a “DisplayDrone” has a touchscreen mounted to it. A computer tracks each drone’s location in the room, and they work together to create a walk-in 3D “display”. So far, so awesome.

It gets even better. Because the program that commands the drones knows where each drone is, it can tell when you’ve moved a drone around in space. That’s extremely cool, and opens up the platform to new interactions. And the DisplayDrone is like a tiny flying cellphone, so you can chat hands-free with your friends while it hovers around your room. Check out the video embedded below the break.
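
The post doesn’t spell out how user interaction is detected, but with a tracking system reporting each drone’s position, the basic check is easy to imagine: compare where a drone actually is against where it was told to hover, and treat a large discrepancy as the user grabbing it. A toy sketch with a made-up threshold:

```python
import numpy as np

MOVE_THRESHOLD_M = 0.10  # illustrative tolerance, in metres

def user_moved_drone(commanded_pos, tracked_pos):
    """Return True when a drone has strayed far enough from its
    commanded hover point that we assume the user repositioned it."""
    error = np.linalg.norm(np.asarray(tracked_pos) - np.asarray(commanded_pos))
    return error > MOVE_THRESHOLD_M
```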

Continue reading “BitDrones are Awesome, Ridiculous at Same Time”

Controlling a Quadcopter with Gestures

[grassjelly] has been hard at work building a wearable device that uses gestures to control quadcopter motion. The goal of the project is a controller that lets the user intuitively control the motion of a quadcopter. Based on the demonstration video below, we’d say they hit the nail on the head. The controller runs off a 5 V Arduino Pro Mini powered by two small coin cell batteries, and it contains an accelerometer and an ultrasonic distance sensor.

The controller allows the quadcopter to mimic the orientation of the user’s hand. The user holds their hand out in front of them, parallel to the floor. When the hand is tilted in any direction, the quadcopter copies the motion and will tilt the same way. The amount of pitch and roll is limited by software, likely preventing the user from over-correcting and crashing the machine. The user can also raise or lower their hand to control the altitude of the copter.
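
The controller itself runs on the Arduino, but the tilt math is easy to sketch. Here’s a conceptual Python rendering of turning raw accelerometer readings into clamped pitch and roll setpoints; the angle limit is our own illustrative number, not [grassjelly]’s:

```python
import math

MAX_ANGLE_DEG = 20.0  # software pitch/roll limit; illustrative value

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def tilt_to_setpoints(ax, ay, az):
    """Convert accelerometer readings (in g) from the user's hand into
    clamped pitch and roll setpoints for the quadcopter."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return (clamp(pitch, -MAX_ANGLE_DEG, MAX_ANGLE_DEG),
            clamp(roll, -MAX_ANGLE_DEG, MAX_ANGLE_DEG))
```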

[grassjelly] has made all of the code and schematics available via GitHub.

Gesture control uses WiFi Doppler shift


We’ve said it before: in the future, simple interfaces will use nothing but your body. At least at first glance, that’s the case with this WiFi-based gesture control system. If you have Internet at home, you probably have a WiFi access point. That’s the first portion of the equation. The remainder is a way of measuring how the radio waves bounce off of your body. So far this is being done with a Software-Defined Radio (SDR), but researchers at the University of Washington think it may be possible to build the technique into future WiFi devices.
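
The full system is considerably more involved, but the core measurement is a narrowband Doppler estimate. As a toy illustration, here’s how you might pull the dominant shift out of complex baseband samples captured by an SDR parked on the WiFi carrier (this is our sketch, not the researchers’ code):

```python
import numpy as np

def doppler_shift_hz(baseband, sample_rate):
    """Estimate the dominant Doppler shift in complex baseband samples.

    Removing the mean knocks out the static (zero-Doppler) paths, so the
    moving reflection, not the carrier itself, dominates the spectrum.
    """
    samples = np.asarray(baseband) - np.mean(baseband)
    windowed = samples * np.hanning(len(samples))
    spectrum = np.fft.fftshift(np.fft.fft(windowed))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(samples), d=1.0 / sample_rate))
    return freqs[np.argmax(np.abs(spectrum))]
```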

The demo video shows a man waving his arm to adjust the volume of his home entertainment system. Intuition tells us that this would be impossible if your arm weren’t the only thing in motion at the time, but that issue is quickly addressed: multiple antennas can track multiple people at the same time. There is also consideration for false positives. The system requires a moderately complex wake-up gesture sequence to prevent you from, say, accidentally turning on the stereo when you roll over in bed.
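
The write-up doesn’t detail the wake-up scheme, but the general shape is a small state machine: gestures are ignored until a preamble sequence is seen in order. A toy Python sketch with a made-up sequence:

```python
WAKE_SEQUENCE = ("push", "pull", "push")  # hypothetical preamble

class GestureGate:
    """Swallow gestures until the wake-up sequence arrives in order."""

    def __init__(self):
        self.progress = 0
        self.awake = False

    def feed(self, gesture):
        if self.awake:
            return gesture  # system is awake; pass commands through
        if gesture == WAKE_SEQUENCE[self.progress]:
            self.progress += 1
            if self.progress == len(WAKE_SEQUENCE):
                self.awake = True
        else:
            self.progress = 0  # wrong gesture resets the preamble
        return None  # nothing gets through until awake
```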

If you’re having trouble wrapping your mind around this, consider this ultrasonic music player. The WiFi version does the same thing, but processing changes in the returning radio waves is much more complex.

Continue reading “Gesture control uses WiFi Doppler shift”