Millimeter Wave RADAR Tracks Gestures

If we believe science fiction, from Minority Report to Iron Man to TekWar, the future of computer interfaces belongs to gestures. There are many ways to read gestures, but most require some sort of glove or IR emitter, which makes them less handy (no pun intended).

Some, like the Leap Motion, have not proved popular for a variety of reasons. Soli (from Google’s Advanced Technology and Projects group) is a gesture sensor that uses millimeter-wave RADAR. The device emits a broad radio beam and then collects the returns, measuring round-trip time, energy, and frequency shift to work out the position and movement of objects in the field. You can see a video about the device below.

You might naturally think of using optical technology to read hand gestures (the same way humans do). However, RADAR has some advantages: it is insensitive to lighting conditions and can transmit through plastic enclosures, for example. The Soli system operates at 60 GHz, with sensor designs that use Frequency Modulated Continuous Wave (FMCW) and Direct-Sequence Spread Spectrum (DSSS) modulation. Multiple beamforming antennas mean the device has no moving parts.
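We haven’t laid hands on Soli hardware, so take this as nothing more than a back-of-the-napkin sketch of the FMCW half of that story: the sensor sweeps a chirp across its bandwidth, mixes the echo back down, and the beat frequency of the result encodes the target’s range. Every number below (bandwidth, chirp time, sample rate) is made up for the demo and has nothing to do with Soli’s real parameters.

```python
# Toy FMCW range estimation. A chirp sweeps bandwidth B over time T; a target
# at range R produces a beat frequency f_b = 2*R*B / (c*T) after the echo is
# mixed with the transmitted chirp. All parameters are illustrative only.
import numpy as np

C = 3e8     # speed of light, m/s
B = 7e9     # chirp bandwidth, Hz (the 60 GHz ISM band is about 7 GHz wide)
T = 1e-3    # chirp duration, s
FS = 2e6    # sample rate of the mixed-down beat signal, Hz

def simulate_beat(range_m):
    """Simulate the beat tone for a single stationary target at range_m."""
    t = np.arange(0, T, 1 / FS)
    f_beat = 2 * range_m * B / (C * T)
    return np.cos(2 * np.pi * f_beat * t)

def estimate_range(beat):
    """Pick the strongest bin of the beat spectrum and convert it to meters."""
    spectrum = np.abs(np.fft.rfft(beat))
    freqs = np.fft.rfftfreq(len(beat), 1 / FS)
    f_beat = freqs[np.argmax(spectrum)]
    return f_beat * C * T / (2 * B)

if __name__ == "__main__":
    for r in (0.05, 0.20, 0.50):   # hand-scale distances, in meters
        print(f"true {r:.2f} m -> estimated {estimate_range(simulate_beat(r)):.3f} m")
```

With these numbers the range comes out quantized to roughly 2 cm bins (resolution is c/2B); in a real sensor, the fine finger motion comes from watching the phase of that return change from chirp to chirp, not from the coarse range bins.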

Clearly, this is cutting-edge gear and not readily available yet. But the good news is that Infineon is slated to bring the sensors to market sometime this year. Planned early applications include a smart watch and a speaker that both respond to gestures using the technology.

Interestingly, the Soli processing stack is supposed to be RADAR-agnostic. We haven’t investigated it, but we wonder if you could use the stack to process other kinds of sensor input that might be more hacker-friendly. Barring that, we’d love to see what our community could come up with to solve the same problem.

We’ve seen Raspberry Pi daughterboards (OK, HATs) that recognize gestures used to control TVs. We’ve even built some crude gesture sensing using SONAR, if that gives you any ideas; there’s a bare-bones sketch of that approach below. Are you planning on using Soli? Or rolling your own super gesture sensor? Let us know and document your project for everyone over on Hackaday.io.
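If you want a head start on that low-rent approach, here’s about all it takes. The sketch below is sensor-agnostic (a SONAR ping, an IR ranger, or a radar range bin all work), every threshold in it is made up, and the simulated readings just stand in for whatever your module reports. The point is that even a single distance measurement sampled over time is enough to tell a push from a pull.

```python
# Crude gesture detection from a stream of range readings (SONAR, IR, radar --
# anything that reports distance to your hand). Thresholds are illustrative.
from collections import deque

NEAR_CM = 10.0   # hand close to the sensor
FAR_CM = 30.0    # hand out at the edge of the field

def classify(samples):
    """Label a burst of distance samples (in cm) as 'push', 'pull', or None."""
    if len(samples) < 4:
        return None
    start, end = samples[0], samples[-1]
    if start > FAR_CM and end < NEAR_CM:
        return "push"    # hand moved toward the sensor
    if start < NEAR_CM and end > FAR_CM:
        return "pull"    # hand moved away from the sensor
    return None

if __name__ == "__main__":
    window = deque(maxlen=8)
    # Simulated readings; swap in polls of your own distance sensor here.
    for reading in (40, 35, 28, 20, 14, 9, 7, 6):
        window.append(reading)
        gesture = classify(list(window))
        if gesture:
            print("gesture:", gesture)
            window.clear()
```

Left and right swipes are where a single ranger runs out of steam, and that is exactly the gap a beamforming radar like Soli is aimed at.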


RadarCat Gives Computers A Sense Of Touch

So far, humans have had the edge in the ability to identify objects by touch, but not for long. Using Google’s Project Soli, a miniature radar that detects the subtlest of gesture inputs, the [St. Andrews Computer Human Interaction group (SACHI)] at the University of St. Andrews has developed a new platform, named RadarCat, that uses the chip to identify materials, as if by touch.

Realizing that different materials return unique radar signatures to the chip, the [SACHI] team combined the sensor with their own recognition software and machine learning, which enables RadarCat to identify a range of materials accurately and in real time! It can also display additional information about the object, such as nutritional information in the case of food, or product information for consumer electronics. The video shows how RadarCat has already learned an impressive range of materials, and even specific body parts. Can Skynet be far behind?
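To be clear, we don’t know the details of [SACHI]’s classifier, and the snippet below is only our guess at the general shape of the problem: the feature vectors are invented and simply stand in for whatever statistics the Soli SDK actually reports. A real system would train a proper model on piles of captures, but even a toy nearest-neighbor check shows why distinct return signatures make the trick possible.

```python
# Toy nearest-neighbor material classifier over radar-return feature vectors.
# Features and values are invented; they stand in for summaries of a return
# (e.g. overall amplitude, spectral centroid, decay rate), scaled to 0..1.
import numpy as np

TRAIN = {
    "air":      np.array([[0.05, 0.10, 0.02], [0.06, 0.12, 0.03]]),
    "wood":     np.array([[0.40, 0.55, 0.30], [0.42, 0.52, 0.28]]),
    "aluminum": np.array([[0.95, 0.20, 0.85], [0.93, 0.22, 0.88]]),
}

def classify(features):
    """Return the material whose training examples sit closest on average."""
    best_label, best_dist = None, float("inf")
    for label, examples in TRAIN.items():
        dist = np.linalg.norm(examples - features, axis=1).mean()
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

if __name__ == "__main__":
    fresh_reading = np.array([0.94, 0.21, 0.86])   # an unknown object's return
    print(classify(fresh_reading))                  # -> aluminum
```

Swap the toy table for real captures and the nearest-neighbor lookup for a trained classifier and you have the bones of what RadarCat does, minus all the hard parts.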
