Controlling This Smartwatch is All in the Wrist

Smartwatches are pretty great. In theory, you’ll never miss a notification or a phone call. Plus, they can do all kinds of biometric tracking since they’re strapped to one of your body’s pulse points. But there are downsides. One of the major ones is that you end up needing two hands to do things that are easily one-handed on a phone. Now, you could use the tip of your nose like I do in the winter when I have mittens on, but that’s not good for your eyes. It seems that the future of smartwatch input is not in available appendages, but in gesture detection.

Enter WristWhirl, the brainchild of Dartmouth and University of Manitoba students [Jun Gong], [Xing-Dong Yang], and [Pourang Irani]. They have built a prototype smartwatch that uses continuous wrist movements, detected by IR proximity sensors, to control popular off-the-shelf applications. Twelve pairs of dirt-cheap IR sensors connected to an Arduino Due detect any of eight simple gestures made by the wearer to do tasks like opening the calendar, controlling a music player, panning and zooming a map, and playing games like Tetris and Fruit Ninja. To save battery, a piezo sensor detects a pinch between the user’s thumb and forefinger, which tells the watch when to start and stop gesture detection.
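The paper doesn’t include firmware, but the power-saving gate is simple enough to sketch. Below is a minimal Arduino-style reconstruction of the idea; the pin assignments, threshold, and four-channel sampling are our assumptions for illustration, not details from the project:

```cpp
// WristWhirl-style gated sampling (our reconstruction, not the authors' code).
// A piezo element detects the thumb/forefinger pinch; while tracking is armed,
// the IR proximity channels are sampled and streamed to a host for classification.

const int PIEZO_PIN = A0;                 // hypothetical wiring
const int IR_PINS[] = {A1, A2, A3, A4};   // a few of the twelve IR channels
const int NUM_IR = sizeof(IR_PINS) / sizeof(IR_PINS[0]);
const int PINCH_THRESHOLD = 200;          // tune for your piezo element

bool tracking = false;

void setup() {
  Serial.begin(115200);
}

void loop() {
  // A pinch spike toggles gesture tracking on and off, saving power when idle.
  if (analogRead(PIEZO_PIN) > PINCH_THRESHOLD) {
    tracking = !tracking;
    delay(300);                           // crude debounce of the pinch event
  }

  if (tracking) {
    // Stream one frame of proximity readings for the host to classify.
    for (int i = 0; i < NUM_IR; i++) {
      Serial.print(analogRead(IR_PINS[i]));
      Serial.print(i < NUM_IR - 1 ? ',' : '\n');
    }
    delay(20);                            // roughly 50 Hz sampling
  }
}
```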

According to their paper (PDF warning), the gesture detection is 93.8% accurate. To get this figure, the team had their test subjects perform each of the eight gestures under different conditions, such as walking versus standing, with the wrist either held in watch-viewing position or hanging at their side. Why not gesture your way past the break to watch a demo?

If you’re stuck on the idea of playing Tetris with gestures, there are other ways.

ARM-Based Gesture Remote Control

When we wave our hands at the TV, it doesn’t do anything. You can change that, though, with an ARM processor and a handful of sensors. You can see a video of the project in action below. [Samuele Jackson], [Tue Tran], and [Carden Bagwell] used a gesture sensor, a SONAR sensor, an IR LED, and an IR receiver along with an mBed-enabled ARM processor to do the job.

The receiver allows the device to learn IR commands from an existing remote, so the gesture remote will work with most setups. The mBed libraries handle communication with the sensors and the universal remote function, and they also provide a simple real-time operating system. That leaves just some simple logic in main.cpp, which comes in at under 250 lines of source code.
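The learn-and-replay trick at the heart of any universal remote is worth sketching: time the mark/space edges coming out of the IR receiver, then play the same pattern back over a roughly 38 kHz carrier. Here’s a rough mbed-style sketch of that idea; the pins, buffer size, and carrier settings are hypothetical, and this is not the project’s actual code:

```cpp
#include "mbed.h"

InterruptIn ir_rx(D2);           // demodulated output from the IR receiver module
PwmOut      ir_led(D3);          // IR LED, driven with a ~38 kHz carrier
Timer       t;

const int MAX_EDGES = 128;
volatile int edges[MAX_EDGES];   // microseconds between successive edges
volatile int edge_count = 0;

void on_edge() {
  // Record the time since the previous edge; the resulting mark/space
  // pattern is the learned remote code.
  if (edge_count < MAX_EDGES) {
    edges[edge_count++] = t.read_us();
    t.reset();
  }
}

void learn() {
  edge_count = 0;
  t.start();
  ir_rx.rise(&on_edge);
  ir_rx.fall(&on_edge);
}

void replay() {
  ir_led.period_us(26);                      // ~38 kHz carrier
  for (int i = 0; i < edge_count; i++) {
    // Alternate carrier on/off for each recorded interval. Which entries are
    // marks depends on the receiver's idle polarity; most idle high.
    ir_led.write((i % 2 == 0) ? 0.33f : 0.0f);
    wait_us(edges[i]);
  }
  ir_led.write(0.0f);                        // make sure the LED ends up off
}
```

A real build would also bound the recording with a timeout and detach the interrupts before replaying, but a structure like this is why the application logic can stay so small.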

Finger recognition on the Kinect

The Kinect is awesome, but if you want to detect anything at a finer resolution than a person’s limbs, you’re out of luck. [Chris McCormick] over at CogniMem has a great solution to this problem: use a neural network on a chip to recognize fingers with hardware already connected to your Xbox.

The build uses the very cool CogniMem CM1K neural network on a chip, trained to tell the difference between counting from one to four on a single hand, as well as an ‘a-okay’ sign, a Vulcan greeting, and rocking out at a [Dio] concert. As [Chris] shows us in the video, these finger gestures can be used to draw on a screen and move objects using only an open palm and closed fist; not too far off from the Minority Report and Iron Man UIs.
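What the CM1K does in silicon is essentially nearest-neighbor matching: each of its 1,024 neurons stores a training vector and fires when an input falls within that neuron’s influence field. Here’s a toy software analogue of that scheme; note this is not the CogniMem API, and the five-element “finger extension” feature encoding is invented purely for illustration:

```cpp
// Toy RBF-style classifier in the spirit of the CM1K (runs on a PC).
#include <cstdlib>
#include <iostream>
#include <limits>
#include <vector>

struct Neuron {
  std::vector<int> pattern;   // stored training vector (hand-shape features)
  int category;               // gesture label it was trained with
  int influence;              // max distance at which this neuron still fires
};

// L1 (Manhattan) distance, one of the norms the CM1K supports.
int l1(const std::vector<int>& a, const std::vector<int>& b) {
  int d = 0;
  for (size_t i = 0; i < a.size(); i++) d += std::abs(a[i] - b[i]);
  return d;
}

// Return the category of the closest firing neuron, or -1 for "unknown".
int classify(const std::vector<Neuron>& net, const std::vector<int>& input) {
  int best = -1;
  int best_d = std::numeric_limits<int>::max();
  for (const Neuron& n : net) {
    int d = l1(n.pattern, input);
    if (d <= n.influence && d < best_d) {
      best_d = d;
      best = n.category;
    }
  }
  return best;
}

int main() {
  // Hypothetical features: one "extension" value per finger, 0 to 9.
  std::vector<Neuron> net = {
    {{9, 0, 0, 0, 0}, 1, 6},   // one finger up
    {{9, 9, 0, 0, 0}, 2, 6},   // two fingers up
    {{9, 9, 9, 9, 9}, 5, 6},   // open palm
  };
  std::cout << classify(net, {8, 8, 1, 0, 1}) << "\n";  // prints 2
}
```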

If you’d like to duplicate this build, the CM1K neural network chip is available for sale, though for a bit more than we’d be willing to pay. A neural net on a chip is an exceedingly cool device, but it looks like this build will have to wait for the Kinect 2 to make it down to the consumer and hobbyist arena.

You can check out the videos of Kinect finger recognition in action after the break with World of Goo and Google Maps.
