[Daniel] wrote in to show us the project his group has been working on: a massive display wall consisting of 28 projectors and 30 computers. With a resolution of 7168×3072, viewing a 13.3 gigapixel image is a treat. That treat is made even better by the fact that the image is navigated multitouch-style with a touchless system built from webcams. We’ve seen lots of projects with similar interfaces come out of the NUI Group, but none that used webcams like this. Usually, the webcam detects some kind of interaction between the person and an infrared light source. Maybe that is happening here and we just don’t see it.
We just found this great portable multitouch rig called the portatouch. Made by [portatouch], a user at the NUI Group website, this system uses a stripped-down LCD as the display, with IR LEDs edge-lighting a touch surface in front of it. A camera mounted below the LCD picks up the reflections of the LEDs and converts them to touch points. While the implementation isn’t anything new, the package is really great. If you want to learn how to set up the technical side of it all, head over to the NUI Group website and you’ll find everything you need. We would love to see a more detailed breakdown of this rig, though. The portability and quick construction are fantastic and seem like they could be reproduced without a ton of custom work.
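The camera-to-touch-point step in rigs like this generally boils down to thresholding the IR image and taking the centroid of each bright blob. Here’s a minimal sketch of that idea in Python; the frame layout, threshold value, and 4-connected flood fill are our own illustrative choices, not details of [portatouch]’s actual software:

```python
def find_touches(frame, threshold=200):
    """frame: 2D list of 0-255 brightness values from the IR camera.
    Returns the centroid (row, col) of each bright blob above threshold."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill this blob, collecting its pixel coordinates.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid becomes one touch point.
                touches.append((sum(p[0] for p in pixels) / len(pixels),
                                sum(p[1] for p in pixels) / len(pixels)))
    return touches
```

A real tracker would also subtract a background frame and match blobs between frames to give each touch a persistent ID, but the core is this simple.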
Subcycles is a sound controller application that [Christian] is using on the third multitouch display that he built. The screen is a sheet of acrylic in an aluminum frame. The image is rear-projected onto an area covered with Digiline dispersion film. As with other projects that use the Community Core Vision package, a PS3 Eye camera captures the touch information.
This build does a great job of including the audience in what the musician on stage is doing. [Chris] points out that the sight of artists staring at laptops on stage is becoming more and more common. The ‘Minority Report’-like interface that Subcycles uses makes not just for interesting music, but for visual reinforcement of the live part of the performance.
tbeta is a new tool developed by the NUI Group community. It acts as an image processing layer, taking in image data and outputting tracking data for multitouch applications. Whether FTIR (frustrated total internal reflection) or DI (diffused illumination), scratch-built multitouch systems generate IR video streams that need to be processed to find fingertips. tbeta can take this or any arbitrary video stream and run it through a series of filters to generate the touch data. The data is sent as TUIO over OSC, a standard protocol for touch events. Along with camera selection and input switching, tbeta also aids in system calibration. It works on Windows, OS X, and Linux. Have a look at the getting started guide for a better idea of how it works.
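TUIO rides on top of OSC, so each cursor update is just an OSC message like `/tuio/2Dcur set s x y X Y m` (session id, then normalized position, velocity, and acceleration). As a rough sketch of what actually goes over the wire, here is a minimal OSC message encoder using only the Python standard library; the session id and coordinates below are made-up example values:

```python
import struct

def osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode one OSC message: address, type tag string, big-endian args."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            tags += "s"
            payload += osc_string(a)
    return osc_string(address) + osc_string(tags) + payload

# A TUIO 1.1 cursor "set" message: session id 3 at position (0.42, 0.87),
# with zero velocity and motion acceleration.
msg = osc_message("/tuio/2Dcur", "set", 3, 0.42, 0.87, 0.0, 0.0, 0.0)
```

A full TUIO sender bundles each `set` with matching `alive` and `fseq` messages and ships the bundle over UDP, conventionally to port 3333, which is where client applications listen for touch events.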
[Christopher Jette] did an amazing job converting a 56″ rear projection television into a multitouch display. His original inspiration came from this drafting table project. The screen is a large sheet of 1/2″ acrylic with a screen material attached to the back side. The screen edge is surrounded by 168 IR LEDs. When a fingertip touches the surface, it scatters the LEDs’ IR light. A webcam sees this scattered light and determines where the fingers are. Inside the box is a standard video projector. This is a great reuse of old equipment, and we love to see a hobbyist making up ground where manufacturers aren’t. For more info on multitouch projects, we suggest the Natural User Interface Group. Here’s a video of [Christopher]’s display in action: