UMotio: An Arduino Compatible 3D Gesture Controller


Mooltipass USB code contributor [Tom] and his friend [Ignatius] recently launched their Indiegogo campaign for the uMotio, a 3D gesture controller (Indiegogo link). As [Tom] has spent much of his personal time helping the Mooltipass community, we figured a nice way to thank him would be to help their great open project get one step closer to becoming a finished product.

As you can see in the video embedded after the break, the uMotio is a plug-and-play system (detected as a USB HID joystick and keyboard with a CDC port) that can be used in many different scenarios: gaming, computer control, home automation, music, etc… The platform is based around an ATmega32U4 and the much-discussed MGC3130 3D tracking and gesture controller, which allows a 0 to 15 cm detection range with a resolution of up to 150 dpi. The uMotio is Arduino compatible, so adapting it to your particular project can be done in no time, especially using its dedicated expansion header and libraries. The uMotio blue even integrates an internal Li-ion battery and a Bluetooth Low Energy module.
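Since the board enumerates with a CDC serial port alongside its HID interfaces, one quick way to play with it from a PC is to listen on that port. Here is a minimal sketch in Python using pyserial; the port name and the "GESTURE:" line format are assumptions made purely for illustration, so check the uMotio libraries and documentation for the actual protocol.

```python
# Minimal sketch: listen on the uMotio's CDC serial port and react to gesture
# events. The port name and the "GESTURE:<name>" line format are assumptions
# for illustration only; the real protocol is defined by the uMotio firmware.
import serial  # pyserial

PORT = "/dev/ttyACM0"   # assumed; on Windows this would be something like "COM3"
BAUD = 115200           # assumed baud rate

with serial.Serial(PORT, BAUD, timeout=1) as umotio:
    while True:
        line = umotio.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        if line.startswith("GESTURE:"):           # e.g. "GESTURE:SWIPE_LEFT"
            gesture = line.split(":", 1)[1]
            print("Got gesture:", gesture)
            # map gestures to actions here (media keys, lights, etc.)
```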

Continue reading “UMotio: An Arduino Compatible 3D Gesture Controller”

Extracting Gesture Information From Existing Wireless Signals

A team at the University of Washington recently developed Allsee, a simple gesture recognition device composed of very few components. Unlike conventional Doppler modules (like this one), which emit their own RF signal, Allsee uses existing wireless signals (TV and RFID transmissions) to detect any movement that occurs in front of it.

Allsee’s receiver circuit uses a simple envelope detector to extract the amplitude information, which is then fed to a microcontroller’s analog-to-digital converter (ADC). Each gesture therefore produces a semi-unique amplitude footprint (see picture above), which can be analyzed to trigger a dedicated action on your computer or cellphone. The paper (PDF) claims that the team achieved 97% classification accuracy over a set of eight gestures.
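As a rough sketch of that last step (and not the team's actual classifier), here is how a nearest-template comparison of envelope footprints might look once the ADC samples are in hand:

```python
# Rough sketch of footprint classification: compare a captured amplitude
# envelope against stored per-gesture templates and pick the closest one.
# This illustrates the general idea only, not Allsee's actual classifier.
import numpy as np

def normalize(trace):
    """Scale an envelope trace to zero mean / unit energy so that absolute
    signal strength doesn't dominate the comparison."""
    trace = np.asarray(trace, dtype=float)
    trace = trace - trace.mean()
    norm = np.linalg.norm(trace)
    return trace / norm if norm else trace

def classify(trace, templates):
    """templates: dict mapping gesture name -> recorded envelope trace
    (all traces assumed to be the same length)."""
    trace = normalize(trace)
    distances = {name: np.linalg.norm(trace - normalize(t))
                 for name, t in templates.items()}
    return min(distances, key=distances.get)

# Example with made-up data: a 'push' gesture dips then recovers,
# a 'pull' gesture does the opposite.
templates = {
    "push": [5, 4, 2, 1, 2, 4, 5],
    "pull": [1, 2, 4, 5, 4, 2, 1],
}
print(classify([5, 3, 2, 1, 3, 4, 5], templates))   # -> "push"
```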

Obviously, the main advantage of this system is its low power consumption. A nice demonstration video is embedded after the break, and we’d like to thank [Korbi] for tipping us off about this story.

Continue reading “Extracting Gesture Information From Existing Wireless Signals”

Gesture Recognition Using Ultrasound


You’d be hard-pressed to find a public restroom that isn’t packed full of hands-free technology these days. From the toilets to the sinks and paper towel dispensers, hands-free tech is everywhere in modern public restrooms.

The idea is to cut down on the spread of germs.  However, as we all know too well, this technology is not perfect. We’ve all gone from sink to sink in search of one that actually worked. Most of us have waved our hands wildly in the air to get a paper towel dispenser to dispense, creating new kung-fu moves in the process. IR simply has its limitations.

What if there was a better way? Check out [Ackerley] and [Lydia]’s work on gesture recognition using ultrasound. Such technology is cheap and could easily be implemented in countless applications where hands-free control of our world is desired. Indeed, the free market has already been developing this technology for use in smartphones and tablets.

Where a video camera will use upwards of 1 watt of power to record video, an ultrasound device will use only microwatts. IR can still be used to detect gestures, as in this gesture-based security lock, but it lacks the resolution that ultrasound can achieve. So let us delve into the details of [Ackerley] and [Lydia]’s ultrasound gesture recognizer, so that we might understand just how it all works, and you too can implement your own ultrasound gesture recognition system.
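Before the deep dive, here is a deliberately crude sketch of the general idea: distance readings from two side-by-side ultrasonic rangefinders can already tell a left-to-right swipe from the reverse. To be clear, this is not [Ackerley] and [Lydia]'s implementation (their write-up covers the real signal processing); the threshold value and the sensor-reading callback are placeholders you would supply yourself.

```python
# Crude swipe detection with two side-by-side ultrasonic rangefinders.
# A hand passing over the pair triggers the left sensor first on a
# left-to-right swipe, and the right sensor first on the reverse.
import time

HAND_THRESHOLD_CM = 20.0   # closer than this => a hand is over the sensor (assumed)

def detect_swipe(read_distance_cm, timeout_s=2.0, poll_s=0.01):
    """read_distance_cm("left" | "right") should return the current distance
    from that rangefinder in centimetres (supply your own hardware driver).
    Returns 'left-to-right', 'right-to-left', or None on timeout."""
    first_hit = None
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        left_close = read_distance_cm("left") < HAND_THRESHOLD_CM
        right_close = read_distance_cm("right") < HAND_THRESHOLD_CM
        if first_hit is None:
            if left_close:
                first_hit = "left"
            elif right_close:
                first_hit = "right"
        elif first_hit == "left" and right_close:
            return "left-to-right"
        elif first_hit == "right" and left_close:
            return "right-to-left"
        time.sleep(poll_s)
    return None

# usage: direction = detect_swipe(my_rangefinder_driver)
```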

Continue reading “Gesture Recognition Using Ultrasound”

3D Gesture Tracking With LIDAR

[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.

The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight, the time it takes for the pulse to be emitted, reflected off an object, and returned to a light sensor. Basically, it’s radar but with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensors.
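That last step is plain trilateration: three known sensor positions plus three measured ranges pin down a point in space. The sketch below shows the geometry in Python. It is not [Reza]'s code, and it assumes the three sensor positions are known and that the object sits in front of the sensor plane (of the two mirror-image solutions, we keep the one on the +z side).

```python
# Classic trilateration: recover a 3D point from its distances to three
# sensors at known positions. Not [Reza]'s code, just the geometry his
# setup relies on.
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)      # local x axis
    i = np.dot(ex, p3 - p1)
    ey = p3 - p1 - i * ex                          # local y axis
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)                          # local z axis (out of sensor plane)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))     # clamp small negatives from noise
    return p1 + x * ex + y * ey + z * ez

# Sensors 10 cm apart on a stand, object held roughly 20 cm away:
print(trilaterate((0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), 0.20, 0.21, 0.21))
```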

Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR can work outside in the sun. Because each LIDAR sensor is measuring the distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.

You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.

Continue reading “3D Gesture Tracking With LIDAR”

Doppler-effect Lets You Add Gestures To Your Computer

What if you could add gesture recognition to your computer without making any hardware changes? This research project uses the computer’s microphone and speakers to recognize hand gestures. An inaudible tone is played over the speakers, and the input from the microphone is processed to detect the Doppler shift caused by anything moving nearby. In this way it can detect your hand movements (or the movement of any object that reflects sound).

The tone sits in the 18-22 kHz range, at or above the limit of human hearing. It does make us wonder if widespread use of this will drive the pet population crazy, or reroute the migration paths of wildlife, but that’s research for another day. The system can even be used while audible sounds are playing, so you don’t lose the ability to listen to music or watch video.
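If you want to get a feel for the technique, a rough sketch in Python is below. It is not the researchers' code: it plays an assumed 19 kHz pilot tone, records the microphone at the same time, and compares the spectral energy just above and just below the pilot to guess whether something is moving toward or away from the machine (using numpy and the sounddevice library).

```python
# Rough sketch of Doppler gesture sensing, in the spirit of SoundWave but not
# the authors' code: play an inaudible pilot tone, record the microphone, and
# compare spectral energy on either side of the pilot. Motion toward the mic
# pushes energy above the pilot; motion away pushes it below.
import numpy as np
import sounddevice as sd

FS = 48000          # sample rate (Hz)
PILOT = 19000       # assumed pilot tone frequency (Hz)
BLOCK = 4096        # samples per analysis block

def doppler_direction(block, band_hz=200, guard_hz=30):
    """Return 'toward', 'away', or 'still' for one block of mic samples."""
    spectrum = np.abs(np.fft.rfft(block * np.hanning(len(block))))
    freqs = np.fft.rfftfreq(len(block), 1.0 / FS)
    above = spectrum[(freqs > PILOT + guard_hz) & (freqs < PILOT + band_hz)].sum()
    below = spectrum[(freqs < PILOT - guard_hz) & (freqs > PILOT - band_hz)].sum()
    if above > 1.5 * below:
        return "toward"
    if below > 1.5 * above:
        return "away"
    return "still"

# Play a 5-second pilot tone while recording the mic, then analyse block by block.
duration = 5
t = np.arange(int(FS * duration)) / FS
tone = 0.2 * np.sin(2 * np.pi * PILOT * t)
mic = sd.playrec(tone, FS, channels=1, blocking=True).ravel()
for start in range(0, len(mic) - BLOCK, BLOCK):
    print(doppler_direction(mic[start:start + BLOCK]))
```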

The screenshot above shows the raw output of the application, but in the video after the break you can see some possible uses. It works for scrolling pages and double-clicking (or double-tapping, as it were), and there’s a function that detects the user walking away from the computer and locks the screen automatically.

[Sidhant Gupta] is the researcher who put the video together. In addition to this project (called SoundWave) he’s got several other interesting alternative-input projects on his research page. Continue reading “Doppler-effect Lets You Add Gestures To Your Computer”

Face Tracking In Opera

http://www.youtube.com/watch?v=1ioV2Dj56iw

Inspired by this year’s April Fools’ Day joke from Opera, [Jason] has made facial gesture recognition actually work. While this may seem like a silly project, it could seriously help some people out: it could be a great accessibility tool for people with motor control limitations. He says it still has some problems, most notably a performance issue with extended use, so he’s hoping to get some input from some bright minds.
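For anyone who wants to experiment with the general idea outside the browser, a minimal face-tracking loop with OpenCV might look like the sketch below. To be clear, this is not [Jason]'s implementation; it simply detects a face with a stock Haar cascade and prints a navigation hint based on which third of the frame the face sits in.

```python
# Minimal face-tracking loop with OpenCV: detect a face in the webcam feed
# and turn its horizontal position into a (printed) navigation hint.
# Not [Jason]'s code, just an illustration of the general approach.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces):
        x, y, w, h = faces[0]
        centre = x + w / 2
        third = frame.shape[1] / 3
        if centre < third:
            print("head left  -> e.g. 'back'")
        elif centre > 2 * third:
            print("head right -> e.g. 'forward'")
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):   # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```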

[thanks, Jordan]