Gesture Control Without Fancy Sensors, Just Pots And Weights

[Dennis] aims to make robotic control a more intuitive affair by ditching joysticks and buttons, and using wireless gesture controls in their place. What’s curious is that there isn’t an accelerometer or gyro anywhere to be seen in his Palm Power! project.

The gesture sensing consists not of a fancy IMU, but of two potentiometers (one for each axis) with offset weights attached to the shafts. When the hand tilts, the weights turn the shafts of the pots, and the resulting readings are translated into motion commands and sent over Bluetooth. The design certainly has a what-you-see-is-what-you-get aspect to it, and as a whole it works much like an inverted, weighted joystick hanging from one’s palm.
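For flavor, here is roughly what the firmware side of that trick could look like: a minimal Arduino-style sketch, with our own guesses for pin assignments, dead band, and command format standing in for [Dennis]’s actual code.

```cpp
// Minimal sketch of the idea: two weighted pots become tilt axes.
// Pin assignments, dead band, and command format are illustrative guesses.
const int PITCH_PIN = A0;   // pot turned by fore/aft tilt
const int ROLL_PIN  = A1;   // pot turned by left/right tilt
const int DEADBAND  = 40;   // ignore small wobbles around center

void setup() {
  Serial.begin(9600);       // e.g. an HC-05-style Bluetooth module on the UART
}

void loop() {
  // Center each axis around 0: raw 0..1023 becomes roughly -512..511.
  int pitch = analogRead(PITCH_PIN) - 512;
  int roll  = analogRead(ROLL_PIN)  - 512;

  // Apply a dead band so the robot holds still when the hand is level.
  if (abs(pitch) < DEADBAND) pitch = 0;
  if (abs(roll)  < DEADBAND) roll  = 0;

  // Scale to -100..100 "speed" commands and send them over Bluetooth.
  Serial.print("M ");
  Serial.print(map(pitch, -512, 511, -100, 100));
  Serial.print(' ');
  Serial.println(map(roll, -512, 511, -100, 100));
  delay(50);                // ~20 updates per second
}
```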

It’s an economical way to play with the idea of motion sensing, and when it comes to prototyping, being able to test a concept while keeping costs to a minimum is a good skill to have.

Gesture Control For Lunch Money

[Dimitris Platis] wanted to add gesture control to his PC. You’d think that would be expensive, but by combining a diminutive Arduino, a breakout board with a gesture controller, and an interconnect PCB, he managed to pull it off for about $7. That doesn’t include the optional 3D-printed case, and we think you could cut costs further by omitting the interconnect board if you don’t mind some wires. [Dimitris] calls it Nevma, and you can see how the device works in the video below.

The heart of the project is a sensor that measures light and motion. The chip and the breakout board are just a couple of bucks if you order them from China. You can find them in the US if you don’t mind spending a little bit more. The device has an I2C interface, and [Dimitris] uses a tiny SS Micro for the USB interface and the CPU.
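The main loop for something like this is short. Here is a hedged sketch assuming the sensor is an APDS-9960-style part driven by the SparkFun library, with the SS Micro’s ATmega32U4 acting as a USB keyboard; [Dimitris]’s actual firmware may well differ.

```cpp
#include <Wire.h>
#include <Keyboard.h>              // ATmega32U4 boards can act as a USB keyboard
#include <SparkFun_APDS9960.h>     // assuming an APDS-9960-style gesture sensor

SparkFun_APDS9960 apds;

void setup() {
  Keyboard.begin();
  apds.init();                     // bring up the I2C link to the sensor
  apds.enableGestureSensor(true);  // true = run with the gesture interrupt enabled
}

void loop() {
  if (apds.isGestureAvailable()) {
    // Map each swipe to a keystroke the PC already understands.
    switch (apds.readGesture()) {
      case DIR_UP:    Keyboard.write(KEY_UP_ARROW);    break;
      case DIR_DOWN:  Keyboard.write(KEY_DOWN_ARROW);  break;
      case DIR_LEFT:  Keyboard.write(KEY_LEFT_ARROW);  break;
      case DIR_RIGHT: Keyboard.write(KEY_RIGHT_ARROW); break;
    }
  }
}
```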

Continue reading “Gesture Control For Lunch Money”

A Minority Report Arduino-Based Hand Controller

Movies love to show technology they can’t really build yet. Even in 2001: A Space Odyssey (released in 1968), for example, the computer screens were actually projected film. The tablet they used to watch the news looks like something you could pick up at Best Buy this afternoon. [CircuitDigest] saw Iron Man and that inspired him to see if he could control his PC through gestures as they do in that film and so many others (including Minority Report). Although he calls it “virtual reality,” we think of VR as being visually immersive, and this is really just the glove, but it is still cool.

The project uses an Arduino on the glove and Processing on the PC. The PC has a webcam which tracks the hand motion and the glove has two Hall effect sensors to simulate mouse clicks. Bluetooth links the glove and the PC. You can see a video of the thing in action, below.
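The glove side of a setup like this can be almost trivially simple. Here is one way it might look; the pin choices and one-byte serial protocol are our assumptions rather than [CircuitDigest]’s code, and the Processing sketch would pair these bytes with the webcam’s hand position to move and click the mouse.

```cpp
// Sketch of the glove side, as we imagine it: two Hall effect sensors
// stand in for mouse buttons, with a Bluetooth module on the UART.
const int LEFT_HALL  = 2;   // magnet near this sensor = left click
const int RIGHT_HALL = 3;   // magnet near this sensor = right click

void setup() {
  pinMode(LEFT_HALL, INPUT_PULLUP);   // typical digital Hall sensors pull low
  pinMode(RIGHT_HALL, INPUT_PULLUP);
  Serial.begin(9600);                 // Bluetooth link to the PC
}

void loop() {
  if (digitalRead(LEFT_HALL) == LOW)  Serial.write('L');
  if (digitalRead(RIGHT_HALL) == LOW) Serial.write('R');
  delay(20);   // crude debounce; Processing fires the actual click
}
```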

Continue reading “A Minority Report Arduino-Based Hand Controller”

Millimeter Wave RADAR Tracks Gestures

If we believe science fiction — from Minority Report to Iron Man, to TekWar — the future of computer interfaces belongs to gestures. There are many ways to read gestures, although often they require some sort of glove or IR emitter, which makes them less handy (no pun intended).

Some, like the Leap Motion, have not proved popular for a variety of reasons. Soli (from Google’s Advanced Technology and Projects group) is a gesture sensor that uses millimeter-wave RADAR. The device emits a broad radio beam and then collects information including return time, energy, and frequency shift to gain an understanding of the position and movement of objects in the field. You can see a video about the device, below.

You naturally think of using optical technology to look at hand gestures (the same way humans do). However, RADAR has some advantages. It is insensitive to light and can transmit through plastic materials, for example. The Soli system operates at 60 GHz, with sensors that use Frequency Modulated Continuous Wave (FMCW) and Direct-Sequence Spread Spectrum (DSSS). The inclusion of multiple beamforming antennas means the device has no moving parts.
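If the FMCW part sounds like magic, the arithmetic behind it is pleasantly simple: the radar sweeps its carrier across a bandwidth B over a chirp time T, and the echo comes back offset by a beat frequency proportional to range. Here is a quick back-of-the-envelope calculation, with made-up numbers rather than Soli’s actual specs:

```cpp
#include <cstdio>

// Back-of-the-envelope FMCW math: the beat frequency between the
// transmitted chirp and its echo is proportional to target range.
// Sweep bandwidth and chirp time below are assumed, not Soli's specs.
int main() {
  const double c       = 3.0e8;   // speed of light, m/s
  const double B       = 4.0e9;   // chirp sweep bandwidth, Hz (assumed)
  const double T_chirp = 1.0e-3;  // chirp duration, s (assumed)

  // A hand 30 cm away: round-trip delay tau = 2R/c, and the beat
  // frequency is f_b = (B / T_chirp) * tau = 2 * R * B / (c * T_chirp).
  const double R   = 0.30;
  const double f_b = 2.0 * R * B / (c * T_chirp);
  printf("beat frequency at %.2f m: %.1f kHz\n", R, f_b / 1e3);

  // Range resolution depends only on bandwidth: dR = c / (2B).
  printf("range resolution: %.1f mm\n", c / (2.0 * B) * 1e3);
  return 0;
}
```

The punchline is that range resolution depends only on the sweep bandwidth, which is why a millimeter-wave band with gigahertz of spectrum to play with can resolve the small motions of individual fingers.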

Clearly, this is cutting-edge gear and not readily available yet. But the good news is that Infineon is slated to bring the sensors to market sometime this year. Planned early applications include a smartwatch and a speaker that both respond to gestures using the technology.

Interestingly, the Soli processing stack is supposed to be RADAR agnostic. We haven’t investigated it, but we wonder if you could use the stack to process other kinds of sensor input that might be more hacker-friendly. Barring that, we’d love to see what our community could come up with for solving the same problem.

We’ve seen Raspberry Pi daughter-boards (ok, hats) that recognize gestures used to control TVs. We’ve even built some crude gesture sensing using SONAR, if that gives you any ideas. Are you planning on using Soli? Or rolling your own super gesture sensor? Let us know and document your project for everyone over on Hackaday.io.

Continue reading “Millimeter Wave RADAR Tracks Gestures”

Controlling This Smartwatch Is All In The Wrist

Smartwatches are pretty great. In theory, you’ll never miss a notification or a phone call. Plus, they can do all kinds of biometric tracking since they’re strapped to one of your body’s pulse points. But there are downsides. One of the major ones is that you end up needing two hands to do things that are easily one-handed on a phone. Now, you could use the tip of your nose like I do in the winter when I have mittens on, but that’s not good for your eyes. It seems that the future of smartwatch input is not in available appendages, but in gesture detection.

Enter WristWhirl, the brainchild of Dartmouth and University of Manitoba students [Jun Gong], [Xing-Dong Yang], and [Pourang Irani]. They have built a prototype smartwatch that uses continuous wrist movements detected by IR proximity sensors to control popular off-the-shelf applications. Twelve pairs of dirt-cheap IR sensors connected to an Arduino Due detect any of eight simple gestures made by the wearer to do tasks like opening the calendar, controlling a music player, panning and zooming a map, and playing games like Tetris and Fruit Ninja. In order to save battery, a piezo sensor detects a pinch between the user’s thumb and forefinger, and this input decides when to start and stop gesture detection.
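To make the power-saving trick concrete, here is the shape the gating logic might take. The pin choices, threshold, running the piezo through a comparator, and sampling one channel per sensor pair are all our assumptions, not the team’s firmware (the real build would need to multiplex far more channels).

```cpp
// A pinch "clicks" the piezo, which toggles sampling of the IR ring,
// so the sensors only burn power while a gesture is being made.
const int PIEZO_PIN   = 2;    // piezo through a comparator, pulsing high on a pinch (assumed)
const int NUM_SENSORS = 12;   // one channel per sensor pair, on A0..A11
bool tracking = false;

void setup() {
  pinMode(PIEZO_PIN, INPUT);
  Serial.begin(115200);
}

void loop() {
  // A sharp piezo spike from a thumb/forefinger pinch toggles tracking.
  if (digitalRead(PIEZO_PIN) == HIGH) {
    tracking = !tracking;
    delay(200);               // ignore ringing after the spike
  }

  if (tracking) {
    // Stream one frame of the IR proximity ring to the gesture classifier.
    for (int i = 0; i < NUM_SENSORS; i++) {
      Serial.print(analogRead(A0 + i));
      Serial.print(i < NUM_SENSORS - 1 ? ',' : '\n');
    }
  }
}
```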

According to their paper (PDF warning), the gesture detection is 93.8% accurate. To get this data, the team had their test subjects perform each of the eight gestures under different conditions, such as walking versus standing, with the wrist either held in watch-viewing position or hanging down at their side. Why not gesture your way past the break to watch a demo?

If you’re stuck on the idea of playing Tetris with gestures, there are other ways.

Continue reading “Controlling This Smartwatch Is All In The Wrist”

ARM-Based Gesture Remote Control

When we wave our hands at the TV, it doesn’t do anything. You can change that, though, with an ARM processor and a handful of sensors. You can see a video of the project in action below. [Samuele Jackson], [Tue Tran], and [Carden Bagwell] used a gesture sensor, a SONAR sensor, an IR LED, and an IR receiver along with an mBed-enabled ARM processor to do the job.

The receiver allows the device to load IR commands from an existing remote so that the gesture remote will work with most setups. The mBed libraries handle communication with the sensors and the universal remote function, and they also provide a simple real-time operating system. That leaves just some simple logic in main.cpp, which comes in at under 250 lines of source code.
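The replay half of a universal remote like this is mostly switching a 38 kHz carrier on and off according to stored mark/space timings. Here is a hedged mBed-style sketch of that path; the pin name assumes an LPC1768-style target, and the timing table is made up rather than captured from a real remote.

```cpp
#include "mbed.h"

PwmOut irLed(p21);             // IR LED driven with a ~38 kHz carrier (pin assumed)

// One learned button: alternating mark/space durations in microseconds,
// as captured earlier from the IR receiver. These values are placeholders.
const int POWER_CODE[] = {9000, 4500, 560, 560, 560, 1690, 560, 560};
const int POWER_LEN = sizeof(POWER_CODE) / sizeof(POWER_CODE[0]);

void sendCode(const int *code, int len) {
    irLed.period_us(26);       // 26 us period is roughly a 38 kHz carrier
    for (int i = 0; i < len; i++) {
        // Even entries are marks (carrier on), odd entries are spaces.
        irLed.write(i % 2 == 0 ? 0.5f : 0.0f);
        wait_us(code[i]);
    }
    irLed.write(0.0f);         // carrier off between frames
}

int main() {
    while (true) {
        // In the real project a gesture or SONAR event would land here;
        // we just fire the learned code every few seconds as a demo.
        sendCode(POWER_CODE, POWER_LEN);
        wait(3.0f);
    }
}
```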

Continue reading “ARM-Based Gesture Remote Control”

Finger Recognition On The Kinect

The Kinect is awesome, but if you want to do anything at a higher resolution than detecting a person’s limbs, you’re out of luck. [Chris McCormick] over at CogniMem has a great solution to this problem: use a neural network on a chip to recognize fingers with hardware already connected to your XBox.

The build uses the very cool CogniMem CM1K neural network on a chip trained to tell the difference between counting from one to four on a single hand, as well as an ‘a-okay’ sign, a Vulcan greeting, and rocking out at a [Dio] concert. As [Chris] shows us in the video, these finger gestures can be used to draw on a screen and move objects using only an open palm and closed fist; not too far off from the Minority Report and Iron Man UIs.
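Under the hood, the CM1K is a bank of RBF/nearest-neighbor “neurons”: each stores a prototype vector and a category, and an input is claimed by whichever prototype it lands closest to, measured with an L1 or Lsup distance in hardware. Here is a toy software rendition of that idea, ours rather than CogniMem’s, using made-up per-finger features:

```cpp
#include <cstdio>
#include <cstdlib>

const int DIMS   = 4;        // crude per-finger "extension" features (illustrative)
const int PROTOS = 3;

const int prototype[PROTOS][DIMS] = {
    {0, 0, 0, 0},            // fist
    {9, 0, 0, 0},            // one finger
    {9, 9, 0, 0},            // two fingers
};
const char *label[PROTOS] = {"fist", "one", "two"};

// L1 (Manhattan) distance, the same norm the CM1K offers in silicon.
int distance(const int *a, const int *b) {
    int d = 0;
    for (int i = 0; i < DIMS; i++) d += abs(a[i] - b[i]);
    return d;
}

int main() {
    const int sample[DIMS] = {8, 1, 0, 1};   // a noisy "one finger" input
    int best = 0;
    for (int p = 1; p < PROTOS; p++)
        if (distance(sample, prototype[p]) < distance(sample, prototype[best]))
            best = p;
    printf("classified as: %s\n", label[best]);  // prints "one"
    return 0;
}
```

The chip’s advantage over this loop is that all of its neurons compare distances in parallel, so classification time stays flat no matter how many prototypes you train.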

If you’d like to duplicate this build, we found the CM1K neural network chip available here for a bit more than we’d be willing to pay. A neural net on a chip is an exceedingly cool device, but it looks like this build will have to wait for the Kinect 2 to make it down to the consumer and hobbyist arena.

You can check out the videos of Kinect finger recognition in action after the break with World of Goo and Google Maps.

Continue reading “Finger Recognition On The Kinect”