This new video about [Pranav Mistry’s] SixthSense project doesn’t bring us much that we haven’t seen before. At least, not on that project. What really caught our eye was the device he shows off at the beginning of the video. Using two old ball mice, he constructed a grip-style input device. It is simple and elegant, and we can definitely see using this in future hacks. Not only is it cheap and apparently effective, it seems as though it could be constructed in a very short amount of time. All you need are the wheels that spin when the ball moves, four springs, and some string. Why didn’t we think of that?
We’re filing this one under “best interface implementation”. This robot is controlled by finger gestures on the surface of an iPod Touch. It can walk forward, turn, sidestep, jump, and kick a ball based on the input it receives from your sweaty digits. Unlike vehicles controlled by an iPhone (or by Power Wheels), this has some potential. Especially considering the inevitable proliferation of multi-touch devices in our everyday lives.
[Benjamin] submitted this slick project. It’s a gesture-based control unit for the iPod and iPhone. It plugs into the dock port and allows you to control the track and volume with simple gestures. While accelerometer-equipped units can already “shake to shuffle”, they lack the ability to simply skip tracks forward or backward. He notes that simple gestures can be harder to decipher with an accelerometer than with a gyro. The gyro can tell which direction you are twisting the device, so it’s easier to utilize. [Benjamin] was previously covered when he released the iPodGPS.
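The gyro’s advantage is easy to see in code: a rate gyro reports signed angular velocity, so integrating it gives you a net twist angle whose sign is the direction. This is a hedged sketch of that idea, not [Benjamin]’s actual firmware; the threshold and sample rate are made-up values.

```python
# Sketch: decode a twist gesture from single-axis gyro samples (deg/s).
def classify_twist(gyro_samples, dt=0.01, threshold=15.0):
    """Integrate angular velocity into a net rotation angle (degrees)
    and map its sign to a track-skip command."""
    angle = sum(rate * dt for rate in gyro_samples)
    if angle > threshold:
        return "next_track"
    if angle < -threshold:
        return "previous_track"
    return "no_gesture"

# A quick clockwise flick: +200 deg/s held for 0.15 s -> +30 degrees.
print(classify_twist([200.0] * 15))  # next_track
```

An accelerometer can’t do this directly: a twist about the vertical axis barely changes the sensed gravity vector, so you’d be guessing direction from noise.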
Some grad students at Duke University have been working on a new tool for cell phones equipped with accelerometers. The software, called Phonepoint Pen, allows you to write with your phone in the air. Though we don’t find the applications they mention very practical, we could see this being very useful for application navigation. If you could program a three-dimensional gesture to load certain apps, that would be nice.
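The gesture-to-launch-an-app idea boils down to template matching: record an accelerometer trace, then compare it against stored gestures and launch whatever is closest. A toy sketch, assuming fixed-length traces and a crude sum-of-squared-differences score (the gesture names and sample data are invented for illustration):

```python
# Sketch: match a recorded 3-D accelerometer trace to the nearest template.
def gesture_distance(trace, template):
    """Sum of squared differences between two equal-length traces
    of (x, y, z) acceleration samples."""
    return sum(
        (a - b) ** 2
        for p, q in zip(trace, template)
        for a, b in zip(p, q)
    )

def match_gesture(trace, templates):
    """Return the name of the closest stored gesture."""
    return min(templates, key=lambda name: gesture_distance(trace, templates[name]))

templates = {
    "open_mail": [(0, 0, 1), (1, 0, 1), (0, 1, 1)],
    "open_maps": [(0, 0, 1), (-1, 0, 1), (0, -1, 1)],
}
print(match_gesture([(0, 0, 1), (0.9, 0.1, 1), (0.1, 1, 1)], templates))
```

Real systems use something more forgiving of timing differences, like dynamic time warping, but the structure is the same.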
[floe] wrote in to tell us about his multitouch-based thesis work. While many projects have focused on the hardware side of multitouch, TISCH is designed to promote the software side. TISCH is a multiplatform library that features hardware abstraction and gesture recognition. This takes a lot of weight off of widget developers, since they can specify known library gestures instead of coding the exact motions from scratch. Using TISCH also means a standard set of gestures across multiple widgets, so the learning curve will be much gentler when a user tries out a new app. If you’re researching multitouch, check out this project and help improve the codebase.
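What that abstraction buys a widget developer looks roughly like this: bind a callback to a named gesture from the library’s standard set, and let the recognizer handle the raw touch events. This is a hypothetical sketch of the pattern, not the real TISCH API; the class, method, and gesture names are invented.

```python
# Sketch: a gesture registry standing in for a library like TISCH.
class GestureRegistry:
    STANDARD = {"tap", "drag", "pinch", "rotate"}  # the shared gesture set

    def __init__(self):
        self._handlers = {}

    def bind(self, gesture, handler):
        """Attach a widget callback to a named standard gesture."""
        if gesture not in self.STANDARD:
            raise ValueError(f"unknown gesture: {gesture}")
        self._handlers[gesture] = handler

    def dispatch(self, gesture, *args):
        """Called by the recognizer once raw touch events have been
        matched to a named gesture."""
        handler = self._handlers.get(gesture)
        if handler:
            handler(*args)

registry = GestureRegistry()
registry.bind("pinch", lambda scale: print(f"zoom by {scale}"))
registry.dispatch("pinch", 1.5)  # prints "zoom by 1.5"
```

The widget never sees touch coordinates at all, which is exactly why every widget built on the library ends up with the same gesture vocabulary.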