Gesture control uses WiFi Doppler shift


We’ve said it before: in the future, simple interfaces will use nothing but your body. At least at first glance that’s the case with this WiFi-based gesture control system. If you have Internet at home you probably have a WiFi access point. That’s the first half of the equation. The rest is a way of measuring how the radio waves bounce off of your body. So far this is being done with Software-Defined Radio (SDR), but researchers at the University of Washington think it may be possible to build the technique into future WiFi devices.
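
For a sense of scale: a reflector moving at velocity v shifts a carrier of frequency f by roughly 2vf/c, the factor of two coming from the round trip out to your hand and back. Here’s a quick back-of-the-envelope sketch in Python (our numbers, not the researchers’):

```python
# Back-of-the-envelope Doppler shift for a gesture reflecting WiFi.
# Illustrative numbers only, not taken from the UW paper.

C = 3.0e8  # speed of light, m/s

def doppler_shift(velocity_mps, carrier_hz):
    """Frequency shift of a signal reflected off a moving body.

    The factor of 2 accounts for the round trip: the moving hand
    acts as both a receiver and a re-transmitter of the wave.
    """
    return 2.0 * velocity_mps * carrier_hz / C

# A hand waved at ~0.5 m/s in front of a 5 GHz access point:
shift = doppler_shift(0.5, 5.0e9)
print(f"Doppler shift: {shift:.1f} Hz")  # roughly 17 Hz on a 5 GHz carrier
```

A shift of a few tens of hertz on a multi-gigahertz carrier is why SDR is in the picture: picking that out of the reflections takes narrowband frequency analysis, not just signal-strength readings.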

The demo video shows this man waving his arm to adjust the volume of his home entertainment system. Intuition tells us this should only work if your arm is the only thing in motion at the time, but that issue is quickly addressed: multiple antennas can track multiple people at the same time. False positives are also accounted for. The system requires a moderately complex wake-up gesture sequence to prevent you from, say, accidentally turning on the stereo when you roll over in bed.
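
The write-up doesn’t spell out the exact wake-up sequence, but the idea is easy to sketch: buffer the recognized gestures and only pass commands through once a specific sequence shows up. Something like this (the sequence itself is our invention):

```python
# Minimal sketch of a wake-up gesture gate, assuming the system hands us a
# stream of already-classified gestures. The actual sequence the UW system
# uses isn't published here; "push, pull, push, pull" is a stand-in.
WAKE_SEQUENCE = ["push", "pull", "push", "pull"]  # hypothetical

class GestureGate:
    def __init__(self, wake_sequence):
        self.wake_sequence = wake_sequence
        self.progress = 0       # how much of the wake sequence we've seen
        self.awake = False

    def feed(self, gesture):
        """Return the gesture if the system is awake, else None."""
        if self.awake:
            return gesture
        # Advance through the wake sequence; any wrong gesture resets it,
        # so rolling over in bed never gets past the gate.
        if gesture == self.wake_sequence[self.progress]:
            self.progress += 1
            if self.progress == len(self.wake_sequence):
                self.awake = True
        else:
            self.progress = 1 if gesture == self.wake_sequence[0] else 0
        return None

gate = GestureGate(WAKE_SEQUENCE)
for g in ["wave", "push", "pull", "push", "pull", "volume_up"]:
    cmd = gate.feed(g)
    if cmd:
        print("accepted:", cmd)  # only "volume_up" makes it through
```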

If you’re having trouble wrapping your mind around this, consider this ultrasonic music player. The WiFi version does the same thing, but processing changes in the returning radio waves is much more complex.

[Read more...]

3D gesture tracking with LIDAR

[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.

The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight, i.e. how long it takes the beam of light to be emitted and reflected back to a light sensor. Basically, it’s radar but with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensors.
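
The position math is classic trilateration. Here’s a quick sketch, assuming a sensor layout of our own choosing since the write-up doesn’t give [Reza]’s actual geometry:

```python
import math

# Hypothetical sensor layout, in meters. Sensor 1 sits at the origin,
# sensor 2 on the x-axis, sensor 3 elsewhere in the z=0 plane; the
# write-up doesn't give [Reza]'s real geometry, so these are made up.
P2_X = 0.30              # sensor 2 at (0.30, 0, 0)
P3 = (0.15, 0.25)        # sensor 3 at (0.15, 0.25, 0)

def trilaterate(r1, r2, r3):
    """Recover (x, y, z) of a target from its distance to each sensor.

    Standard closed-form trilateration for the layout above.
    """
    d, i, j = P2_X, P3[0], P3[1]
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z_sq = r1**2 - x**2 - y**2
    if z_sq < 0:
        return None  # noisy readings: the three spheres don't intersect
    return (round(x, 3), round(y, 3), round(math.sqrt(z_sq), 3))

# Distances the three sensors might report for a hand above the rig:
print(trilaterate(0.20, 0.25, 0.28))
```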

Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR can work outside in the sun. And because each LIDAR sensor measures the distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.

You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware, after the break.

[Read more...]

Graffiti Analysis

Here’s a fascinating project that started with a great idea and piled on a remarkable amount of innovation. Graffiti Analysis is a project that captures the gestures used to create graffiti art and codifies them in a data format called Graffiti Markup Language (GML). After the break you can watch a video showing the data capture method used in version 2.0 of the project. A marker taped to a light source is used to draw a graffiti tag on a piece of paper. The paper rests on a plexiglass drawing surface with a webcam tracking the underside in order to capture each motion.
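
The tracking itself is the easy part. Assuming the light taped to the marker is the brightest thing the webcam sees, a rough OpenCV version looks something like this (this is the concept, not the project’s actual code):

```python
# Rough sketch of the v2.0 capture idea: track the brightest point in each
# webcam frame (the light taped to the marker) and log (x, y, t) samples.
import time
import cv2

cap = cv2.VideoCapture(0)        # webcam watching the underside of the plexiglass
stroke = []                      # accumulated (x, y, t) samples for one tag
t0 = time.time()

while time.time() - t0 < 10.0:   # capture ten seconds of drawing
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (11, 11), 0)   # smooth out hot pixels
    _, max_val, _, max_loc = cv2.minMaxLoc(gray)
    if max_val > 200:            # only log when the light is actually visible
        x, y = max_loc
        stroke.append((x, y, time.time() - t0))

cap.release()
print(f"captured {len(stroke)} points")
```

Those time-stamped (x, y, t) samples, grouped into strokes, are essentially what GML stores, which is what lets a tag be replayed with its original speed and rhythm.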

The newest iteration, version 3.0, has some unbelievable features. The addition of audio input means that the markup can be projected and animated based on sound, with the example of graffiti interacting with a fireworks show. The 3D tools are quite amazing too, allowing not only for stereoscopic video playback, but for printing out graffiti markup using a 3D printer. The collection of new features is so vast, and produces such amazing results, that it’s hard to put into words. So we’ve also embedded the demo of the freshly released version after the break.
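
We can only guess at how the audio-reactive mode works internally, but a toy version is easy to imagine: displace each replayed stroke point by the loudness of the audio at that moment. Purely our sketch of the effect:

```python
# Toy version of audio-reactive playback: push each time-stamped point
# outward from the stroke's center by the audio loudness at that moment.
# This is our own guess at the effect, not the project's implementation.
import math

def animate(points, rms_at):
    """points: (x, y, t) samples; rms_at(t): audio loudness in [0, 1]."""
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    for x, y, t in points:
        gain = 1.0 + 0.5 * rms_at(t)      # louder audio pushes points outward
        yield (cx + (x - cx) * gain, cy + (y - cy) * gain, t)

# Fake a beat with a sine wave standing in for the audio envelope:
wobble = list(animate([(0, 0, 0.0), (10, 5, 0.5), (20, 0, 1.0)],
                      lambda t: abs(math.sin(2 * math.pi * t))))
print(wobble)
```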

[Read more...]

Mister Gloves, gesture input

This two-handed glove input setup, by [Sean Chen] and [Evan Levine], is one step closer to achieving that [Tony Stark]-like workstation; i.e., interacting with software in 3D with simple hand gestures. Dubbed Mister Gloves, the system sends accelerometer, push button, and flex sensor data over RF, where an MCU converts it to a standard USB device, meaning no drivers are needed and a Windows PC recognizes it as a standard keyboard and mouse. Catch a video of Mister Gloves playing Portal after the jump.
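
The clever bit is the MCU-side translation from raw sensor readings to stock HID reports. Here’s a hedged sketch of that mapping in Python (all thresholds and scale factors are invented; the real thing runs on the microcontroller):

```python
# Sketch of the sensor-to-mouse mapping an MCU like Mister Gloves' might do.
# Every threshold and scale factor here is invented for illustration.

DEADZONE = 0.05      # ignore tiny accelerometer jitter, in g
SPEED = 40           # cursor pixels per g of tilt
FLEX_CLICK = 0.7     # flex-sensor reading (0..1) that counts as a "click"

def glove_to_mouse(accel_x, accel_y, flex):
    """Convert one packet of glove data into a (dx, dy, click) HID report."""
    dx = int(accel_x * SPEED) if abs(accel_x) > DEADZONE else 0
    dy = int(accel_y * SPEED) if abs(accel_y) > DEADZONE else 0
    click = flex > FLEX_CLICK   # curling a finger presses the button
    return dx, dy, click

# Hand tilted slightly right, index finger curled:
print(glove_to_mouse(0.20, -0.02, 0.85))   # -> (8, 0, True)
```

Because the MCU enumerates as a stock USB keyboard and mouse, everything downstream of this mapping is the operating system’s problem; that’s what makes the no-drivers trick work.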

While amazing, we’re left wondering whether gesture setups are really viable options, considering one’s arms would surely get tired. [Read more...]

BiDi Screen, on (and off) screen multitouch

MIT is debuting its latest advancement in technology: a multitouch screen that also functions as a gestural interface. The multitouch aspect is nothing new; the team explains how traditional interfaces using LEDs or camera systems work, but fail to recognize gestures off-screen.

Gestures are a relatively recent highlight, with the introduction of projects like Project Natal or perspective tracking, but those fail to work at closer distances to the screen. MIT has done what seems impossible by combining and modifying the two to produce the first ever multitouch, close-proximity gestural display.

And to think, just a couple of months ago the same school was playing with pop-up books.

[via Engadget]

Pranav Mistry’s cool input devices

http://ted.com/talks/view/id/685

This new video about [Pranav Mistry's] SixthSense project doesn’t bring us much that we haven’t seen before. At least, not on that project. What really caught our eye was the device he shows off at the beginning of the video. Using two old ball mice, he constructed a grip-style input device. It is simple and elegant, and we can definitely see using this in future hacks. Not only is it cheap and apparently effective, it seems as though it could be constructed in a very short amount of time. All you need are the wheels that spin when the ball moves, four springs, and some string. Why didn’t we think of that?
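
If you want to play along at home, the math is just odometry: string wrapped around the mouse’s encoder roller pays out a fixed length per tick. A quick sketch, with every number made up:

```python
# Back-of-the-envelope for the ball-mouse string sensor: string wrapped
# around the mouse's encoder roller pays out a fixed length per tick.
# Roller size and tick count are invented for illustration.
import math

ROLLER_DIAMETER_MM = 6.0     # typical ball-mouse roller, roughly
TICKS_PER_REV = 48           # slots in the encoder wheel (made up)
MM_PER_TICK = math.pi * ROLLER_DIAMETER_MM / TICKS_PER_REV

def string_extension_mm(tick_count):
    """How far the string has been pulled, given raw encoder ticks."""
    return tick_count * MM_PER_TICK

print(f"{string_extension_mm(120):.1f} mm of string out")  # ~47.1 mm
```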

[thanks Sean]