Millimeter Wave RADAR Tracks Gestures

If we believe science fiction, from Minority Report and Iron Man to TekWar, the future of computer interfaces belongs to gestures. There are many ways to read gestures, but they often require some sort of glove or IR emitter, which makes them less handy (no pun intended).

Some, like the Leap Motion, have not proved popular for a variety of reasons. Soli (from Google’s Advanced Technology and Projects group) is a gesture sensor that uses millimeter-wave RADAR. The device emits a broad radio beam and then collects information, including return time, energy, and frequency shift, to build up a picture of the position and movement of objects in its field. You can see a video about the device below.

You’d naturally think of using optical technology to look at hand gestures (the same way humans do). However, RADAR has some advantages: it is insensitive to lighting conditions and can transmit through plastic, for example. The Soli system operates at 60 GHz, with sensors that use Frequency-Modulated Continuous Wave (FMCW) and Direct-Sequence Spread Spectrum (DSSS) techniques. The use of multiple beamforming antennas means the device has no moving parts.
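To get a feel for how “frequency shift” becomes distance in an FMCW radar, here’s a minimal Python sketch of the standard range equation. The chirp bandwidth, sweep time, and beat frequency below are invented for illustration, not Soli’s actual parameters:

```python
# Minimal FMCW range math: the radar sweeps a chirp across bandwidth B
# in time T. An echo delayed by t_d mixes with the outgoing chirp to
# produce a beat tone at f_b = (B / T) * t_d, so measuring a frequency
# is effectively measuring a distance.

C = 3.0e8  # speed of light, m/s

def fmcw_range_m(beat_hz, bandwidth_hz, chirp_s):
    """One-way target range from a measured beat frequency."""
    slope = bandwidth_hz / chirp_s     # chirp slope, Hz/s
    round_trip_s = beat_hz / slope     # echo delay
    return C * round_trip_s / 2.0

# Hypothetical numbers (not Soli's actual chirp parameters):
# a 7 GHz sweep in 1 ms, and a 4.67 kHz beat tone at the mixer output.
print(fmcw_range_m(beat_hz=4.67e3, bandwidth_hz=7.0e9, chirp_s=1.0e-3))
# -> ~0.10 m: a hand about 10 cm from the sensor
```

Gesture recognition then comes from how measurements like this evolve over time, not from any single range reading.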

Clearly, this is cutting-edge gear and not readily available yet. But the good news is that Infineon is slated to bring the sensors to market sometime this year. Planned early applications include a smart watch and a speaker that both respond to gestures using the technology.

Interestingly, the Soli processing stack is supposed to be RADAR-agnostic. We haven’t investigated it, but we wonder if you could use the stack to process other kinds of sensor input that might be more hacker-friendly. Failing that, we’d love to see what our community could come up with to solve the same problem.

We’ve seen Raspberry Pi daughter-boards (OK, HATs) that recognize gestures used to control TVs. We’ve even built some crude gesture sensing using SONAR, if that gives you any ideas. Are you planning on using Soli? Or rolling your own super gesture sensor? Let us know, and document your project for everyone over on Hackaday.io.



Controlling A Quadcopter With Gestures

[grassjelly] has been hard at work building a wearable device that uses gestures to control quadcopter motion. The goal of the project is a controller that lets the user intuitively direct the quadcopter. Based on the demonstration video below, we’d say they hit the nail on the head. The controller runs on a 5 V Arduino Pro Mini powered by two small coin cell batteries, and it contains an accelerometer and an ultrasonic distance sensor.

The controller allows the quadcopter to mimic the orientation of the user’s hand. The user holds their hand out in front of them, parallel to the floor. When the hand tilts in any direction, the quadcopter copies the motion and tilts the same way. The amount of pitch and roll is limited in software, likely to prevent the user from over-correcting and crashing the machine. The user can also raise or lower their hand to control the altitude of the copter.
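We haven’t dug through the firmware, but the core mapping is easy to sketch: derive pitch and roll from the accelerometer’s gravity vector, clamp them in software, and scale hand height from the ultrasonic sensor into an altitude setpoint. Here’s a rough Python sketch of that logic; the angle limit and height range are our own guesses, not values from [grassjelly]’s code:

```python
import math

# Our guesses, not values from [grassjelly]'s firmware:
MAX_ANGLE_DEG = 20.0           # software cap on pitch/roll authority
MIN_CM, MAX_CM = 30.0, 120.0   # usable hand-height range (ultrasonic)
MAX_ALT_M = 3.0                # altitude at full hand height

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def hand_to_setpoints(ax, ay, az, hand_height_cm):
    """Map an accelerometer g-vector (in g) and ultrasonic hand height
    to pitch/roll/altitude setpoints for the quadcopter."""
    # Tilt angles from the gravity vector; a flat hand reads (0, 0)
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))

    # Limit authority so a twitchy hand can't flip the copter
    pitch = clamp(pitch, -MAX_ANGLE_DEG, MAX_ANGLE_DEG)
    roll = clamp(roll, -MAX_ANGLE_DEG, MAX_ANGLE_DEG)

    # Raising or lowering the hand scales directly into altitude
    frac = clamp((hand_height_cm - MIN_CM) / (MAX_CM - MIN_CM), 0.0, 1.0)
    return pitch, roll, frac * MAX_ALT_M

# Hand tilted slightly forward, held 75 cm above the floor:
print(hand_to_setpoints(0.17, 0.0, 0.98, 75.0))   # ~ (9.8, 0.0, 1.5)
```

The clamp is doing the job the write-up describes: capping pitch and roll so an over-enthusiastic wrist can’t flip the machine.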

[grassjelly] has made all of the code and schematics available on GitHub.

Doppler-effect Lets You Add Gestures To Your Computer

What if you could add gesture recognition to your computer without making any hardware changes? This research project uses a computer’s existing microphone and speakers to recognize hand gestures. A tone is played over the speakers, and the input from the microphone is processed to detect the Doppler shift of its reflections. In this way, the system can detect your hand movements (or the movement of any object that reflects sound).

The tones sit at the top edge of the audible range, between 18 and 22 kHz, high enough that most ears can’t hear them. It does make us wonder if widespread use of this will drive the pet population crazy, or reroute the migration paths of wildlife, but that’s research for another day. The system can even be used while audible sounds are playing, so you don’t lose the ability to listen to music or watch video.
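The detection itself is classic spectral analysis: emit a known tone, then watch the microphone’s spectrum for energy smeared to either side of that tone. Here’s a small Python simulation of the idea; the sample rate, tone frequency, echo strength, and hand speed are our own illustrative numbers, not parameters from the SoundWave paper:

```python
import numpy as np

FS = 48_000       # typical sound-card sample rate, Hz
F_TONE = 20_000   # pilot tone near the top of hearing, Hz
C_SOUND = 343.0   # speed of sound in air, m/s

def reflected_freq(f_tone, v_hand):
    """Echo frequency off a hand moving at v_hand m/s
    (positive = toward the mic): f' = f * (c + v) / (c - v)."""
    return f_tone * (C_SOUND + v_hand) / (C_SOUND - v_hand)

# Simulate 100 ms of mic input: the steady pilot tone plus a weaker
# echo off a hand approaching at 0.5 m/s.
t = np.arange(int(0.1 * FS)) / FS
mic = (np.sin(2 * np.pi * F_TONE * t)
       + 0.3 * np.sin(2 * np.pi * reflected_freq(F_TONE, 0.5) * t))

# Windowed FFT; ignore bins at the pilot tone itself and find the echo
spectrum = np.abs(np.fft.rfft(mic * np.hanning(len(mic))))
freqs = np.fft.rfftfreq(len(mic), d=1.0 / FS)
mask = np.abs(freqs - F_TONE) > 25         # Hz away from the carrier
echo = freqs[mask][np.argmax(spectrum[mask])]
print(f"echo at {echo:.0f} Hz -> "
      + ("hand approaching" if echo > F_TONE else "hand receding"))
```

A real implementation has to cope with room echoes and the speaker and mic frequency response, but the core trick really is this simple: motion toward the microphone piles energy above the pilot tone, and motion away piles it below.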

The screen above shows the raw output of the application. But in the video after the break you can see some possible uses. It works for scrolling pages, double-clicking (or double-tapping as it were), and there’s a function that detects the user walking away from the computer and locks the screen automatically.

[Sidhant Gupta] is the researcher who put the video together. In addition to this project (called SoundWave) he’s got several other interesting alternative-input projects on his research page.

Cloud Mirror Adds Internet To Your Morning Ritual

This mirror has a large monitor behind it that can be operated using hand gestures. It’s the result of a team effort by [Daniel Burnham], [Anuj Patel], and [Sam Bell] to build a web-enabled mirror for their ECE 4180 class at the Georgia Institute of Technology.

So far they’ve implemented four widgets for the system. You can see the icons that activate each one in the column to the right of the mirror. From top to bottom, they are Calendar, News, Traffic, and Weather. The video after the break shows the gestures used to control the display. First, select a widget by holding your hand over the appropriate icon. Next, bring that widget to the main display area by swiping from right to left along the top of the mirror.

Hardware details are shared more freely in their presentation slides (PDF). A sonar distance sensor activates the device when a user is close enough to the screen, and seven IR reflectance sensors detect a hand placed in front of them. We like this input method, as it keeps the ‘display’ area fingerprint-free. But we wonder: could the IR sensors be placed behind the glass instead of beside it?
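A row of reflectance sensors makes swipe detection almost trivial: record which sensor trips and when, then check that the activations sweep monotonically across the row within a short window. Here’s a Python sketch of that logic; the sensor count matches the build, but the timing threshold and minimum sweep width are our own guesses:

```python
import time

N_SENSORS = 7            # one IR sensor per icon along the mirror top
SWIPE_WINDOW_S = 0.6     # max duration of a swipe (our guess)
MIN_SWEEP = 4            # sensor positions a swipe must cross (our guess)

def detect_swipe(events):
    """events: [(timestamp, sensor_index), ...] in arrival order.
    Returns 'left', 'right', or None."""
    if len(events) < 2 or events[-1][0] - events[0][0] > SWIPE_WINDOW_S:
        return None
    idx = [i for _, i in events]
    span = idx[-1] - idx[0]
    if idx == sorted(idx) and span >= MIN_SWEEP:
        return "right"                    # indices climbing across the row
    if idx == sorted(idx, reverse=True) and -span >= MIN_SWEEP:
        return "left"                     # indices falling across the row
    return None

# A hand sweeping right-to-left across sensors 6..1 in 0.3 seconds:
now = time.time()
events = [(now + 0.06 * k, 6 - k) for k in range(6)]
print(detect_swipe(events))               # -> 'left'
```

Selection, by contrast, would just be one sensor staying active while its neighbors stay quiet.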
