Glimpses Of A 3D Volumetric Display

Custom displays are a lot of fun to look at, but this one is something we’d expect to see at a trade show and not on someone’s kitchen table. [Taha Bintahir] built a 3D volumetric display and is showing it off in the image above using a 3DS file of the Superman logo exported from Autodesk software. In the video after the break you can see that the display is a transparent pyramid which lets a viewer see the 3D object inside from any viewpoint around the display. Since first posting about it, he has also added a Kinect to the mix, allowing a user to control the 3D object with body movements.

There’s basically no information about the display hardware in [Taha’s] post, so we asked him about it. It works by first taking a 3D model and rendering it from four different camera angles. He’s using a custom-designed prism for the display, and the initial renderings are distorted to match that prism’s dimensions. Those renderings are projected onto the prism to give the illusion of a 3D object floating at its center.
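We don’t have [Taha’s] code, but the four-view trick is easy to sketch. Here’s a rough Python example (using Pillow, with hypothetical file names, sizes, and rotations, not [Taha’s] actual setup) that composites four pre-rendered views into the cross-shaped frame a four-sided pyramid display expects:

```python
# Rough sketch: composite four pre-rendered views of a model into the
# cross-shaped layout a four-sided pyramid display expects. File names,
# sizes, and rotations are assumptions, not [Taha]'s actual setup.
from PIL import Image

SIZE = 480  # edge length of each rendered view, in pixels
views = {name: Image.open(f"{name}.png").resize((SIZE, SIZE))
         for name in ("front", "right", "back", "left")}

canvas = Image.new("RGB", (SIZE * 3, SIZE * 3), "black")
# Rotate each view so it appears upright after reflecting off its face.
canvas.paste(views["front"].rotate(180), (SIZE, 0))         # top edge
canvas.paste(views["right"].rotate(90), (SIZE * 2, SIZE))   # right edge
canvas.paste(views["back"], (SIZE, SIZE * 2))               # bottom edge
canvas.paste(views["left"].rotate(270), (0, SIZE))          # left edge
canvas.save("pyramid_frame.png")  # this frame gets projected onto the prism
```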

We’re hoping to hear more details about how this was designed and what hardware is being used. We’ll post a follow-up if [Taha] shares more information.


Drill-based Kinect Camera

[Brett Graham] and [David Cox] are taking the Kinect out into the world thanks to this handheld hack they call the Drill of Depth. Apparently the Kinect wants 12V at 1A, which is quite easy to provide with a rechargeable power tool like this Ryobi drill. The setup features a 4.3″ touchscreen display connected to a Gumstix Overo Air running Linux. They claim that there’s a “legitimate scientific reason” for building the device, but they’re not sharing it yet.

So what would you use this for? We wonder if it would be possible to roll a GPS receiver into the mix, then post-process the captured data to recreate the environment in a virtual setting. Imagine if a weekend spent walking around campus and processing the results let you model your university and turn it into an add-on level for your favorite game. Or perhaps this could be paired with a regular camera to generate high-quality 3D skinning data for Google Earth. That’s what we came up with; what do you think?
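If you wanted to try that reconstruction idea, step one is simply logging timestamped depth frames for later post-processing. Here’s a minimal sketch using the libfreenect Python bindings (assuming python-freenect is installed; the output file name is made up):

```python
# Rough sketch: log timestamped Kinect depth frames for later post-processing
# (e.g. pairing them with GPS fixes by timestamp). Assumes the libfreenect
# Python bindings are installed and the Kinect is powered and plugged in.
import time
import numpy as np
import freenect

log = []
for _ in range(30):                         # grab roughly 3 seconds of frames
    depth, _ = freenect.sync_get_depth()    # (480, 640) array of raw depth
    log.append((time.time(), depth.copy()))
    time.sleep(0.1)

np.savez("depth_log.npz",
         times=np.array([t for t, _ in log]),
         frames=np.stack([d for _, d in log]))
```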

PR2 + Kinect

Willow Garage, the makers of the PR2 robot, have been playing with the Kinect. You might be a little tired of seeing every little new project people are doing with it, but there’s something here we couldn’t help but point out. When we posted the video of the guy doing 3D rendering with the Kinect, many of the commenters were speculating on how to get full environments into the computer. Those of you who said “just use two, facing each other” seem to have been onto something. You can see that they are doing exactly that in the image above: the blue point cloud is from one Kinect, the red cloud from another. The Willow Garage crew is using this to do telemetry through the PR2 as well as some gestural controls. You can download the OpenKinect stack for the Robot Operating System here. Be sure to check out the video after the break to see the PR2 being controlled via the Kinect, as well as some nice demonstrations of how the Kinect sees the environment.
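For the curious, merging the two views is conceptually simple once you know the rigid transform between the cameras. Here’s a rough numpy sketch of folding camera B’s point cloud into camera A’s frame; this is our own illustration, not Willow Garage’s code, and the example transform values are made up:

```python
# Rough sketch: merge point clouds from two Kinects facing each other.
# R and t (the transform from camera B's frame into camera A's) would come
# from a calibration step; the values below are just an illustration.
import numpy as np

def merge_clouds(cloud_a, cloud_b, R, t):
    """cloud_a, cloud_b: (N, 3) XYZ points in each camera's own frame."""
    cloud_b_in_a = cloud_b @ R.T + t   # rotate, then translate B's points
    return np.vstack([cloud_a, cloud_b_in_a])

# Example: camera B sits 2 m in front of A, rotated 180 degrees about Y.
R = np.array([[-1.0, 0.0,  0.0],
              [ 0.0, 1.0,  0.0],
              [ 0.0, 0.0, -1.0]])
t = np.array([0.0, 0.0, 2.0])
merged = merge_clouds(np.random.rand(1000, 3), np.random.rand(1000, 3), R, t)
```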

[via BotJunkie]


LED Wall And Kinect Join Forces

[Alex] wrote in to let us know about this Kinect-controlled LED wall that was whipped up at the Tetalab hackerspace in Toulouse, France. The wall, which was built earlier in the year, uses MAX7313 I2C LED intensity controllers. Each gets its own board and controls the intensity of sixteen red LEDs, which are embedded in the wall modules and covered with ping-pong balls as diffusers.

The recent activity on the project takes advantage of the Xbox Kinect. As you can see in the video after the break, they’ve used the open source Kinect drivers to capture 3D environment data, processing it into color gradients which are displayed on the pong wall. It shouldn’t be long before someone comes knocking on their door to install this in a dance club. We love the effect, especially because it works in a dark room and the LEDs don’t cause any interference with the video capture.
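The mapping from depth data to the wall is easy to sketch: average each block of depth pixels down to one LED and turn “closer” into “brighter”. The grid size and intensity range below are our assumptions, not the actual Tetalab wall:

```python
# Rough sketch: squash a Kinect depth frame down to per-LED intensities.
# The grid size and the 16-step intensity range are assumptions, not the
# Tetalab wall's real dimensions or driver settings.
import numpy as np

WALL_W, WALL_H = 16, 8  # hypothetical LED grid

def depth_to_wall(depth, max_raw=2047):
    """depth: (480, 640) array of raw 11-bit Kinect depth values."""
    h, w = depth.shape
    bh, bw = h // WALL_H, w // WALL_W
    # Average each block of depth pixels down to a single LED.
    blocks = depth[:WALL_H * bh, :WALL_W * bw].reshape(WALL_H, bh, WALL_W, bw)
    mean_depth = blocks.mean(axis=(1, 3))
    # Nearer objects map to brighter LEDs, scaled to 16 intensity steps.
    intensity = (1.0 - mean_depth / max_raw) * 15
    return np.clip(intensity, 0, 15).astype(np.uint8)
```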


Rendering A 3D Environment From Kinect Video

[Oliver Kreylos] is using an Xbox Kinect to render 3D environments from real-time video. In other words, he takes the video feed from the Kinect and runs it through some C++ software he wrote to index the pixels in a 3D space that can be manipulated as it plays back. The image above is the result of the Kinect viewing [Oliver] from his right side, with the playback perspective moved to be above and in front of him. Part of his body is missing and there is a black shadow because the camera cannot see those areas from its vantage point. This is very similar to the real-time 3D scanning we’ve seen in the past, but the hardware and software combination makes this a snap to reproduce. Get the source code from his page linked at the top, and don’t miss his demo video after the break.
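The core of this kind of hack is the reprojection math: each depth pixel gets pushed back out along its camera ray into a 3D point. Here’s a rough numpy sketch of that step; the intrinsics are commonly quoted approximate Kinect values, not [Oliver]’s calibration, and this is our illustration rather than his code:

```python
# Rough sketch: re-project each depth pixel into a 3D point using a pinhole
# camera model. Focal length and principal point are approximate values
# often quoted for the Kinect depth camera, not a real calibration.
import numpy as np

FX = FY = 594.0        # focal length in pixels (approximate)
CX, CY = 320.0, 240.0  # principal point (image center)

def depth_to_points(depth_m):
    """depth_m: (480, 640) array of depths in meters; returns (N, 3) points."""
    v, u = np.indices(depth_m.shape)          # row and column index per pixel
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.dstack([x, y, z]).reshape(-1, 3)
    return points[points[:, 2] > 0]           # drop pixels with no depth
```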


Kinect And TISCH Combined For Multitouch

[Florian] sent a link to his proof of concept for creating a multitouch display using the Kinect. He’s the one behind the libTISCH multitouch package, and that’s what he used to get this working, along with the recently released Kinect drivers. He did this on an Ubuntu machine and, although it’s not a turnkey solution, he was kind enough to share some rough directions on accomplishing it yourself. Join us after the break for his instructions and some embedded video.
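We don’t have [Florian]’s code handy, but the underlying trick is easy to sketch: calibrate the empty surface, then treat anything hovering just above it as a touch. Here’s a rough Python illustration; the thresholds are guesses, and the blob tracking that libTISCH handles is left out:

```python
# Rough sketch of the idea, not [Florian]'s actual libTISCH pipeline:
# pixels sitting just above a calibrated background surface count as touches.
import numpy as np

def find_touches(depth, background, near=5, far=30):
    """depth, background: (480, 640) raw Kinect depth frames.
    A pixel is a touch when it is between `near` and `far` raw units
    closer to the camera than the empty background surface was."""
    diff = background.astype(int) - depth.astype(int)
    return (diff > near) & (diff < far)

# Calibrate by averaging a few frames of the empty surface; the resulting
# touch mask would then be fed to a blob tracker for multitouch events.
```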


Open Source Kinect Contest Has Been Won

Adafruit Industries has announced the winner of the Open Source Kinect contest. [Hector], whom we mentioned yesterday, has won by providing both RGB and depth access to the device. Some of you were asking at the time why the contest was not over yet; well, Adafruit had to verify the drivers. The image you see above is of another user, [qdot], verifying the drivers on his machine.

What is interesting is how Adafruit has chosen to close this contest. Not only are they giving [Hector] his prize money, they are also donating an additional $2,000 to the EFF, which fights for our right to legally hack and reverse engineer our own equipment.

[Hector] is being generous as well, using his prize money to help pay for gadgets to hack with the teams he is involved with, mainly the iPhone Dev Team and the Wii hacking team Twiizers.