Laptop LCD Reused In Beagleboard Project

This daughterboard lets [Matt Evans] drive a laptop LCD using a Beagleboard. Apparently the Beagleboard gained a VGA header when it moved to revision C, but [Matt’s] working with revision B4, which is why he had to do all of that ninja soldering with the blue wires. The driver board itself is a thing of beauty, hosting a DS90C363 LVDS serialiser as well as some buffer chips that handle level conversion for it. He’s also included an ATmega48 so that he has some options for future improvements.

The LCD is mounted in a custom acrylic case, with the Beagleboard and driver board taped to the back of it. There’s an RS232 port and a USB hub, which opens up the possibility of using a WiFi dongle for communications. So far he doesn’t have much functionality beyond displaying images on the screen, but there is some talk of using a touchpad for control. We’d love to see a touchscreen overlay, transforming the build into a proper ARM-based tablet.

Projector Tricks Make Use Of Kinect 3D Mapping

[Don’t stop the clock] is doing some work with a projector, a camera, and the Kinect. What he’s accomplished is quite impressive, combining the three to manipulate light with your body. The image above is a safer rendition of the Hadouken from the Street Fighter video games, throwing light across the room instead of fire. This comes at the end of the video after the break, but first he’ll show off the core features of the system. You can hold up your hand and wave it to turn it into a light source. In other words, the projector shines light on your hand, following it as it moves and adjusting the intensity based on your hand’s location in 3D space. Since the Kinect sends fairly precise data back to the computer, the projected image is trimmed to match your hand and arm without overflowing onto the rest of the room, until you touch your hand to a surface you want illuminated or throw the light source with a flick of the wrist. It may seem trivial at first glance, but we find the alignment of the projector and the speed at which the image updates to be quite impressive.
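To give a sense of how the light ends up confined to just the hand, here’s a minimal sketch of the usual approach, not [Don’t stop the clock]’s actual code: segment the nearest object in the Kinect depth frame and warp that mask into projector space. The projector resolution, the depth band, and the calibration homography H are all assumptions for illustration.

```python
# Illustrative only: mask a projector frame to the nearest object (a raised
# hand) seen by the Kinect. Assumes depth frames are 16-bit numpy arrays in
# millimetres and H is a 3x3 homography from depth-camera to projector pixels.
import cv2
import numpy as np

PROJ_W, PROJ_H = 1024, 768   # assumed projector resolution

def light_mask(depth_mm, H, band_mm=150):
    """Return a projector-sized image lit only where the nearest object is."""
    valid = depth_mm > 0                          # Kinect reports 0 for no reading
    if not valid.any():
        return np.zeros((PROJ_H, PROJ_W), np.uint8)
    nearest = depth_mm[valid].min()               # distance to the closest surface
    # Keep pixels within a thin band behind that point: the hand and forearm.
    mask = (valid & (depth_mm < nearest + band_mm)).astype(np.uint8) * 255
    # Knock out speckle noise so the projected edge stays clean.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # Re-map from depth-camera coordinates into projector coordinates.
    return cv2.warpPerspective(mask, H, (PROJ_W, PROJ_H))
```

The throwing and stick-to-a-surface behaviour would sit on top of something like this, switching from tracking the hand to holding the last projected spot.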

Continue reading “Projector Tricks Make Use Of Kinect 3D Mapping”

Glimpses Of A 3D Volumetric Display

Custom displays are a lot of fun to look at, but this one is something we’d expect to see at a trade show and not on someone’s kitchen table. [Taha Bintahir] built a 3D volumetric display and is showing it off in the image above using a 3DS file of the Superman logo exported from Autodesk software. In the video after the break you can see that the display is a transparent pyramid, which allows a viewer to see the 3D object inside from any viewpoint around the display. Since first posting about it he has also added a Kinect to the mix, allowing a user to control the 3D object with body movements.

There’s basically no information about the display hardware in [Taha’s] post, so we asked him about it. It works by first taking a 3D model and rendering it from four different camera angles. He’s using a custom-designed prism for the display, and the initial renderings are distorted to match that prism’s dimensions. Those renderings are projected onto the prism to give the illusion of a 3D object floating at its center.
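The exact pre-distortion [Taha] applies for his custom prism isn’t published, but the general recipe for these pyramid displays is well known: render the model from four angles and arrange the views around the screen centre so each one reflects off its own face of the pyramid. A rough sketch, with sizes and orientations assumed for illustration:

```python
# Sketch of a four-view layout for a reflective pyramid display; the rotations
# assume the pyramid sits point-down at the centre of a square screen.
import cv2
import numpy as np

def pyramid_frame(front, right, back, left, screen=1080):
    """Compose four same-sized square renders into one frame to show
    beneath the pyramid."""
    canvas = np.zeros((screen, screen, 3), np.uint8)    # black keeps reflections crisp
    s = front.shape[0]                                  # edge length of each render
    c = screen // 2
    canvas[screen - s:screen, c - s // 2:c + s // 2] = front
    canvas[0:s, c - s // 2:c + s // 2] = cv2.rotate(back, cv2.ROTATE_180)
    canvas[c - s // 2:c + s // 2, 0:s] = cv2.rotate(left, cv2.ROTATE_90_CLOCKWISE)
    canvas[c - s // 2:c + s // 2, screen - s:screen] = cv2.rotate(
        right, cv2.ROTATE_90_COUNTERCLOCKWISE)
    return canvas
```

Warping each view to match the geometry of the custom prism, as [Taha] describes, is where the real tuning would happen.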

We’re hoping to hear more details about how this was designed and what hardware is being used. We’ll post a follow-up if [Taha] shares more information.

Continue reading “Glimpses Of A 3D Volumetric Display”

BendDesk Multi-touch Furniture

The BendDesk is a horizontal and a vertical multi-touch display connected as one curved surface. Think of it as a smart whiteboard and a multi-touch desk all in one. It can be used to sort and edit information, or to play games. Check out “Bend Invaders”, a game demonstrated in the video after the break. When you touch two fingers to the display, the two points are used to aim a laser at the oncoming monsters.

The system uses a combination of two projectors shining on the surface from underneath and behind. A series of LEDs around the edges of the display bathe it in infrared light. Three cameras with IR filters peer at the underside of the acrylic surface and detect touches as the bright spots that appear where a finger disrupts the IR light bouncing around inside the acrylic, a technique known as Frustrated Total Internal Reflection (FTIR). If you’re interested in more of the math and science involved, there are a couple of papers available from the project site linked at the top of this post.
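The camera side of FTIR sensing boils down to blob detection. The snippet below is our own minimal sketch (not the BendDesk code), assuming a greyscale frame from one of the IR-filtered cameras and a reference frame captured with nobody touching the desk:

```python
# Illustrative FTIR touch detection: fingers frustrate the IR light trapped in
# the acrylic and show up as bright blobs to the cameras underneath.
import cv2

def find_touches(ir_frame, background, min_area=30):
    """Return (x, y) centroids of likely finger contacts."""
    diff = cv2.subtract(ir_frame, background)             # remove static glow
    diff = cv2.GaussianBlur(diff, (7, 7), 0)
    _, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) < min_area:                 # skip noise and hover
            continue
        m = cv2.moments(c)
        touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches
```

BendDesk also has to merge the three camera views and the two projector images across the curved surface, which is presumably where those papers come in.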

We’ve seen so many displays using the Kinect lately that it’s refreshing to see one that doesn’t.

Continue reading “BendDesk Multi-touch Furniture”

Projector Introduces Augmented Reality To Reality

[Raj Sodhi] and [Brett Jones] have been working on interactive augmented reality as part of their research at the University of Illinois. What they have come up with is a stylus-based input system that can use physical objects to create a virtual landscape. Above you can see an environment built from white blocks. A camera maps the arrangement so that a virtual world can be built to match the physical design. From there, an infrared stylus can be used to manipulate virtual data, which is projected onto the blocks.

What they’ve created is a very advanced IR whiteboard. There are buttons on the stylus, one of which opens the menu of circles you can see above. From there you can select a tool and make it do your bidding. After the break there’s a video demonstration where a game is set up, using the menu to place tanks and mines on the 3D playing field. We wonder how hard it would be to do this using a projector and a Kinect.
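For the flat parts of the interaction, tracking an IR stylus works much like any IR whiteboard: find the brightest spot in the camera image and map it into projector coordinates. Here’s a simplified sketch under our own assumptions (the real system also has to account for the 3D block geometry):

```python
# Illustrative stylus tracking: the stylus tip carries an IR LED, so it shows
# up as the brightest pixel to an IR camera. H is a 3x3 camera-to-projector
# homography found during calibration.
import cv2
import numpy as np

def stylus_position(ir_frame, H, min_brightness=200):
    """Return the stylus tip in projector pixels, or None if the LED is off."""
    _, max_val, _, max_loc = cv2.minMaxLoc(ir_frame)   # brightest pixel location
    if max_val < min_brightness:
        return None
    pt = np.array([[max_loc]], dtype=np.float32)       # shape (1, 1, 2)
    x, y = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(x), float(y)
```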

Continue reading “Projector Introduces Augmented Reality To Reality”

Kinect And TISCH Combined For Multitouch

[Florian] sent a link to his proof of concept in creating a multitouch display using the Kinect. He’s the one behind the libTISCH multitouch package, and that’s what he used to get this working along with the recently released Kinect drivers. He did this on an Ubuntu machine and, although it’s not a turnkey solution, he was kind enough to share some rough directions on accomplishing it yourself. Join us after the break for his instructions and some embedded video.
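[Florian]’s rough directions are the place to start, but the core trick of turning a Kinect into a touch sensor is easy to sketch (this is our own illustration, not libTISCH code): record a depth map of the empty surface, then treat anything that comes within a couple of centimetres of it as a touch.

```python
# Illustrative Kinect touch detection against a flat surface. Assumes 16-bit
# depth frames in millimetres; the thresholds are guesses that need tuning.
import cv2
import numpy as np

def touch_points(depth_mm, surface_mm, near=8, far=25, min_pixels=40):
    """Return (x, y) centroids of regions hovering just above the surface."""
    # Height of each pixel above the pre-recorded empty-surface depth map.
    height = surface_mm.astype(np.int32) - depth_mm.astype(np.int32)
    touching = ((height > near) & (height < far) & (depth_mm > 0)).astype(np.uint8)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(touching)
    # Label 0 is the background; keep blobs large enough to be a fingertip.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_pixels]
```

From there, a framework like libTISCH can take over turning raw contact points into gestures and widget events.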

Continue reading “Kinect And TISCH Combined For Multitouch”

Data Plotting For The Visually Impaired

This setup helps to represent data in a meaningful way for visually impaired people. It uses a combination of physical objects to represent data clusters, and audio feedback when manipulating those objects. In the video after the break you’ll see that the cubes can orient themselves to represent data clusters. The table top acts as a graphing field, with a textured border as a reference for the user. A camera mounted below the clear surface allows image processing software to calculate the locations of the cubes. Each cube is motorized and contains an Arduino and a ZigBee module, listening for positioning information from the computer that is doing the video processing. Once in position, the user can move the cubes, with modulated noise indicating how near they are to the heart of each data cluster.
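The audio side is the interesting bit: the write-up says the noise is modulated by how close a cube is to the heart of a cluster, but it doesn’t spell out the mapping. Here’s a tiny sketch of one way it could work, with the function names and the louder-when-closer direction being our own assumptions:

```python
# Illustrative mapping from a cube's tracked position to an audio modulation
# level; the real system's scaling and direction may differ.
import math

def noise_level(cube_xy, cluster_centres, max_dist=0.5):
    """Return a 0..1 modulation level: 1.0 at a cluster centre, 0.0 at
    max_dist metres or further from the nearest centre."""
    d = min(math.dist(cube_xy, c) for c in cluster_centres)
    return 1.0 - min(d, max_dist) / max_dist
```

For example, noise_level((0.30, 0.12), [(0.25, 0.10), (0.60, 0.40)]) comes out near 0.9, so the feedback would be strong as the cube closes in on the first cluster.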

The team plans to conduct further study on the usefulness of this interactive data object. We certainly see potential for hacking, as this uses off-the-shelf components that are both inexpensive and easy to find. It reminds us of a multitouch display with added physical tokens.

Continue reading “Data Plotting For The Visually Impaired”