BiDi Screen, On (and Off) Screen Multitouch

[youtube http://www.youtube.com/watch?v=kXuxK6IeQfo&feature=player_embedded]

MIT is debuting its latest advancement: a multitouch screen that also functions as a gestural interface. The multitouch aspect is nothing new; the team explains how traditional interfaces using LEDs or camera systems work, but those systems fail to recognize gestures off-screen.

Gesture recognition is a relatively recent highlight, thanks to projects like Project Natal and perspective tracking, but those approaches fail at close range to the screen. MIT has done what seemed impossible by combining and modifying the two to produce the first multitouch, close-proximity gestural display.

And to think, just a couple of months ago the same school was playing with pop-up books.

[via Engadget]

Pranav Mistry’s Cool Input Devices

[ted id=685]

This new video about [Pranav Mistry’s] SixthSense project doesn’t bring us much that we haven’t seen before. At least, not on that project. What really caught our eye was the device he shows off at the beginning of the video. Using two old ball mice, he constructed a grip-style input device. It’s simple and elegant, and we can definitely see it turning up in future hacks. Not only is it cheap and apparently effective, it looks like it could be built in very little time. All you need are the encoder wheels that spin when the ball moves, four springs, and some string. Why didn’t we think of that?
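The video doesn’t go into how the wheels get read, but ball-mouse encoder wheels are standard quadrature encoders: two optical channels roughly 90 degrees out of phase, where the order of transitions tells you which way the wheel turned. Here’s a minimal Python sketch of that decoding; the transition table and class are ours, not anything from the project.

```python
# Transition table: (previous AB state, current AB state) -> step.
# Valid quadrature moves change exactly one channel at a time.
STEP = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

class QuadratureDecoder:
    """Accumulates signed counts from a stream of (A, B) samples."""
    def __init__(self):
        self.state = 0b00
        self.count = 0

    def sample(self, a: int, b: int) -> int:
        new = (a << 1) | b
        self.count += STEP.get((self.state, new), 0)  # ignore glitches
        self.state = new
        return self.count

# Example: one wheel stepping forward through a full quadrature cycle.
dec = QuadratureDecoder()
for a, b in [(0, 1), (1, 1), (1, 0), (0, 0)]:
    pos = dec.sample(a, b)
print(pos)  # -> 4
```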

[thanks Sean]

Robot Interface Lets Fingers Do The Walking

[youtube=http://www.youtube.com/watch?v=945Z2xtdEBE]

We’re filing this one under “best interface implementation”. This robot is controlled by finger gestures on the surface of an iPod Touch. It can walk forward, turn, sidestep, jump, and kick a ball based on the input it receives from your sweaty digits. Unlike vehicles controlled by an iPhone (or by Power Wheels), this has some potential. Especially considering the inevitable proliferation of multi-touch devices in our everyday lives.
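The video doesn’t reveal how strokes map to robot commands, but the basic idea is easy to sketch: classify each finger stroke by its dominant direction and dispatch a command. The command names, tap radius, and thresholds below are purely our assumptions for illustration.

```python
import math

COMMANDS = {"up": "walk_forward", "down": "walk_backward",
            "left": "sidestep_left", "right": "sidestep_right"}

def classify_stroke(start, end, tap_radius=10.0):
    """Map a stroke from (x0, y0) to (x1, y1) to a robot command."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < tap_radius:
        return "kick"                       # a tap in place -> kick
    if abs(dx) > abs(dy):                   # mostly horizontal swipe
        return COMMANDS["right" if dx > 0 else "left"]
    return COMMANDS["down" if dy > 0 else "up"]  # screen y grows downward

print(classify_stroke((100, 200), (100, 80)))  # -> walk_forward
print(classify_stroke((50, 50), (53, 52)))     # -> kick
```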

iPodGyro

[youtube=http://www.youtube.com/watch?v=DG8khSe5gBQ&feature=player_embedded]

[Benjamin] submitted this slick project: a gesture-based control unit for the iPod and iPhone. It plugs into the dock port and lets you control track selection and volume with simple gestures. While accelerometer-equipped units can already “shake to shuffle”, they lack the ability to simply skip tracks forward or backward. He notes that simple gestures are harder to decipher with an accelerometer than with a gyro. A gyro can tell which direction you’re twisting the device, so gestures are easier to distinguish. [Benjamin] was previously covered here when he released the iPodGPS.
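The reason the gyro wins is that its rate signal is signed: integrating the angular rate over a short window gives both the size and the direction of the twist. A rough sketch of that logic, with names and thresholds that are purely illustrative and not from [Benjamin]’s firmware:

```python
def detect_twist(rates_dps, dt=0.01, threshold_deg=30.0):
    """rates_dps: z-axis angular rates in deg/s, sampled every dt seconds.
    Returns 'next', 'previous', or None."""
    angle = sum(r * dt for r in rates_dps)  # integrate rate -> angle
    if angle > threshold_deg:
        return "next"        # clockwise twist: skip forward
    if angle < -threshold_deg:
        return "previous"    # counterclockwise twist: skip back
    return None              # too small to count as a gesture

# Half a second of a ~90 deg/s twist, in each direction:
print(detect_twist([90.0] * 50))   # -> next
print(detect_twist([-90.0] * 50))  # -> previous
```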

PhonePoint Pen

[youtube=http://www.youtube.com/watch?v=Nvu2hwMFkMs&feature=player_embedded]

Some grad students at Duke University have been working on a new tool for cell phones equipped with accelerometers. The software, called PhonePoint Pen, allows you to write with your phone in the air. Though we don’t find the applications they mention very practical, we could see this being very useful for application navigation. If you could program a three-dimensional gesture to load certain apps, that would be handy.
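That “gesture launches an app” idea is a classic template-matching problem: compare a fresh accelerometer trace against stored templates and pick the closest. Here’s a toy one-axis sketch using dynamic time warping; the templates and app names are made up, and the Duke project’s actual matcher may work quite differently.

```python
def dtw(a, b):
    """Dynamic-time-warping distance between two 1-D traces."""
    inf = float("inf")
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[-1][-1]

TEMPLATES = {          # one recorded z-axis trace per gesture
    "open_mail":  [0, 1, 2, 1, 0, -1, -2, -1, 0],
    "open_music": [0, -1, -2, -1, 0, 1, 2, 1, 0],
}

def recognize(trace):
    return min(TEMPLATES, key=lambda name: dtw(trace, TEMPLATES[name]))

print(recognize([0, 1, 1.5, 1, 0, -1, -1.5, -1, 0]))  # -> open_mail
```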

TISCH, Multitouch Framework


[floe] wrote in to tell us about his multitouch-based thesis work. While many projects have focused on the hardware side of multitouch, TISCH is designed to promote the software side. TISCH is a multiplatform library that features hardware abstraction and gesture recognition. This takes a lot of weight off of widget developers, since they can specify known library gestures instead of implementing the exact motions from scratch. Using TISCH also means a standard set of gestures across multiple widgets, so the learning curve is much gentler when a user tries out a new app. If you’re researching multitouch, check out this project and help improve the codebase.
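To make the abstraction concrete, here’s a small Python sketch of the pattern TISCH promotes. This is not TISCH’s actual API (the library itself is its own codebase); it just illustrates widgets subscribing to named gestures from a shared recognizer instead of re-implementing the motion math.

```python
class GestureRecognizer:
    """Stands in for the library layer that turns raw touch events
    (from whatever hardware sits underneath) into named gestures."""
    def __init__(self):
        self.handlers = {}

    def on(self, gesture_name, callback):
        self.handlers.setdefault(gesture_name, []).append(callback)

    def emit(self, gesture_name, **details):
        for cb in self.handlers.get(gesture_name, []):
            cb(**details)

class PhotoWidget:
    """A widget asks for standard gestures by name; it never sees the
    raw contact points or the tracking hardware."""
    def __init__(self, recognizer):
        recognizer.on("pinch", self.on_pinch)
        recognizer.on("rotate", self.on_rotate)

    def on_pinch(self, scale):
        print(f"resize photo by {scale:.2f}x")

    def on_rotate(self, degrees):
        print(f"rotate photo by {degrees:.1f} degrees")

recognizer = GestureRecognizer()
widget = PhotoWidget(recognizer)
recognizer.emit("pinch", scale=1.25)     # -> resize photo by 1.25x
recognizer.emit("rotate", degrees=15.0)  # -> rotate photo by 15.0 degrees
```

Because every widget pulls from the same named-gesture vocabulary, a pinch means the same thing everywhere, which is exactly why the learning curve flattens for users.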