[Chris Harrison] and [Scott E. Hudson] have built a novel system for faking a 3D video chat session. Their implementation separates the image of the chat participant from the background, then dynamically repositions the video layers based on the movement of the viewer's head. They're using the OpenCV library for face detection (just like the Laughing Man demo). The 3D effect is very similar to what you see in [Johnny Lee]'s Wiimote head tracking. A video of the pseudo 3D chat is embedded below.
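The motion-parallax math behind the effect can be sketched roughly like this (a minimal sketch, not the authors' code; the face position would come from a detector such as OpenCV's, and the gain value here is an assumption):

```python
# Sketch of the pseudo-3D effect: the viewer's face center (as a face
# detector like OpenCV's would report it) drives a parallax shift of the
# separated foreground layer. The gain of 0.2 is a made-up tuning value.

def parallax_shift(face_cx, face_cy, frame_w, frame_h, gain=0.2):
    """Map the viewer's face center to a pixel offset for the
    foreground (participant) layer. A centered face gives no shift;
    the foreground moves opposite the head, mimicking parallax."""
    dx = (frame_w / 2 - face_cx) / (frame_w / 2)
    dy = (frame_h / 2 - face_cy) / (frame_h / 2)
    return (dx * gain * frame_w, dy * gain * frame_h)

print(parallax_shift(320, 240, 640, 480))  # centered -> (0.0, 0.0)
print(parallax_shift(0, 240, 640, 480))    # head at left edge -> (128.0, 0.0)
```

Each frame, the separated participant layer is simply redrawn at this offset over the (stationary or counter-shifted) background, which is what sells the depth illusion.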
Lately we’ve been focusing on multitouch technologies, but that doesn’t mean there isn’t interesting research going on in other areas of human-computer interaction. [Johnny Lee] posted a roundup of some of the work that [Gonzalo Ramos] and others have done with pen-based input. The video embedded above shows how pen pressure can be used to increase control precision. Have a look at his post to see how pen gestures can be used for seamless workspace sharing and how pen rolling can give additional control.
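One way pressure can buy precision is by scaling cursor gain: press lightly to move fast, press hard to move finely. This is a hedged sketch of that general idea, not [Gonzalo Ramos]'s actual implementation; the gain curve and limits are assumptions:

```python
# Sketch: map normalized pen pressure (0.0-1.0) to a cursor movement
# gain. Light pressure -> full-speed coarse movement; heavy pressure ->
# movement scaled down for fine positioning. min_gain is a made-up knob.

def cursor_gain(pressure, min_gain=0.1):
    """Linearly interpolate gain from 1.0 (no pressure) down to
    min_gain (full pressure). Pressure is clamped to [0, 1]."""
    pressure = max(0.0, min(1.0, pressure))
    return 1.0 - (1.0 - min_gain) * pressure

def apply_gain(dx, dy, pressure):
    """Scale a raw pen movement delta by the pressure-derived gain."""
    g = cursor_gain(pressure)
    return (dx * g, dy * g)

print(cursor_gain(0.0))  # -> 1.0  (coarse)
```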
The Evolution Control Committee has been doing live mashup performances for many years and recently upgraded their hardware. Inspired by [Johnny Lee]’s Wiimote whiteboard, they built a rear-projection display they could use during performances. It displays a dense collection of samples in Ableton Live. On each of the performer’s hands is an IR LED mounted to a thimble. Touching the thumb to the forefinger turns the LED on. Two Wiimotes watch for these IR flashes and translate them into mouse clicks. [TradeMark G] found the Ableton display too complex to navigate quickly and accurately with a mouse; the new display makes things much easier and more enjoyable.
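The click logic reduces to edge detection on the Wiimote's IR reports. This is a sketch under assumptions: the actual IR data would come from a Wiimote library (cwiid is one option on Linux), and here each camera frame is reduced to a single "LED visible" boolean:

```python
# Sketch: a click fires on the rising edge -- the moment the thumb
# touches the forefinger and the IR LED first appears to the Wiimote.
# Holding the pinch does not re-fire; releasing and pinching again does.

def clicks_from_frames(frames):
    """frames: iterable of bools, one per camera report, True when the
    IR LED is visible. Returns the indices where a click should fire."""
    events = []
    was_visible = False
    for i, visible in enumerate(frames):
        if visible and not was_visible:
            events.append(i)  # LED just appeared -> pinch -> click
        was_visible = visible
    return events

# Off, a pinch held for two frames, release, pinch again:
print(clicks_from_frames([False, True, True, False, True]))  # -> [1, 4]
```

With two Wiimotes watching the same screen area, each hand's LED position would additionally be mapped to screen coordinates (as in the whiteboard project) to decide where the click lands.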
[via Laughing Squid]
There is no doubt that [Johnny Lee] is the authority on Wiimote-based projects. So, when he compiles a list of his favorite Wiimote projects, we definitely pay attention. He’s organized the list as a progression of the unusual. By the time you get to ‘Chicken Head Tracking’ at the bottom, you’ll be adequately prepared. You’re bound to get some inspiration from the list, even if it’s for building a pigeon-guided missile.
[Johnny Lee]’s colleague [Paul Dietz] has done some interesting work using interactive tables. He’s specifically researched how to determine how full a drink glass is. In the video above, he’s using Microsoft’s Surface, but this technique should work with any IR-camera-based multitouch table. Determining the drink level requires custom glassware with a small prism set into the base. When the liquid level is above the prism, IR light passes through into the drink, but when the level drops below it, the prism reflects more IR light back into the table. Using this information, restaurant staff could serve drinks more efficiently.
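On the software side, the detection can be as simple as thresholding the reflected brightness under the glass. A minimal sketch of that principle follows; the threshold and patch values are assumptions for illustration, not [Paul Dietz]'s numbers:

```python
# Sketch: average the IR camera pixels under the glass's base. A dry
# prism reflects strongly; a submerged prism lets light pass into the
# liquid. A bright patch therefore means the glass needs a refill.

def needs_refill(ir_patch, threshold=180):
    """ir_patch: 2D list of 0-255 IR intensities under the glass base.
    Returns True when the mean reflection exceeds the threshold,
    i.e. the liquid has dropped below the prism."""
    values = [p for row in ir_patch for p in row]
    mean = sum(values) / len(values)
    return mean > threshold

full_glass = [[40, 55], [60, 45]]       # liquid above prism: weak return
empty_glass = [[220, 235], [240, 210]]  # dry prism: strong IR return
print(needs_refill(full_glass), needs_refill(empty_glass))  # -> False True
```

In a real table the patch location would come from whatever tag or blob tracking the table already uses to know where each glass is sitting.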
[Paul] has worked on another project that uses RFID and capacitive sensing to similar effect.