Microsoft shows off their transparent 3D desktop prototype

We think most would agree that the Microsoft Kinect is a miraculous piece of hardware. The availability of an affordable, high-quality depth camera was the genesis of a myriad of hacks. And now it seems that same kind of data is making an intriguing 3D display possible.

What you see above is a 3D monitor concept that Microsoft developed. It starts off looking much like a tablet PC, but the screen can be lifted up toward the user, whose arms reach around it to get at the keyboard underneath. A depth camera watches the user's hands and fingers to allow manipulation of the virtual environment. But that's only part of the problem: you also need some way to align the user's eyes with what's on the screen. They seem to have solved that too, using a second depth camera to track the location of the user's head. This means you can lean from one side to the other and the perspective of the virtual 3D desktop will change to preserve the apparent distance of each object.
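
For a sense of how head tracking feeds into the rendering, the usual trick (and this is only our guess at the general approach, not Microsoft's actual code) is an off-axis projection: treat the physical screen as a fixed window into the virtual scene and rebuild the view frustum from wherever the tracked head currently sits. A minimal sketch, assuming a screen centered at the origin and a head position measured in meters:

```python
import numpy as np

def off_axis_projection(head, screen_w, screen_h, near=0.1, far=100.0):
    """Asymmetric frustum for a screen centered at the origin in the x/y
    plane, viewed from head = (x, y, z) in meters with z > 0.  Moving the
    head changes the frustum, so objects appear to stay put in space."""
    ex, ey, ez = head
    # Project the physical screen edges onto the near plane.
    left   = (-screen_w / 2 - ex) * near / ez
    right  = ( screen_w / 2 - ex) * near / ez
    bottom = (-screen_h / 2 - ey) * near / ez
    top    = ( screen_h / 2 - ey) * near / ez
    # Standard glFrustum-style projection matrix built from those edges.
    proj = np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])
    # The view matrix just shifts the world so the eye sits at the origin.
    view = np.eye(4)
    view[:3, 3] = [-ex, -ey, -ez]
    return proj @ view

# Example numbers only: a viewer leaning 20 cm left of a 50 cm x 30 cm screen.
mvp = off_axis_projection(head=(-0.2, 0.0, 0.6), screen_w=0.5, screen_h=0.3)
```

Feed in the tracked head position every frame and the parallax falls out for free; the hard part is doing the tracking robustly, which is where that second depth camera comes in.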

Don’t miss the show-and-tell video after the break. As long as there’s only one viewer this looks like a perfect non-glasses alternative to current 3D hardware offerings.

Make any photo 3D using The Gimp

Put your face close to the screen and cross your eyes until the two images above become one. You may need to adjust the tilt of your chin to make it happen, but when they come together you’ll see [John Lennon] pop out in 3D. This was made using a 3D rendering script for The Gimp.

The process is not entirely automatic, but it won’t take too long to mask off the outlines for the different depth layers. The script makes three different layers from the image. One of them is a color-coded depth map that uses a custom color palette to set the distance for each item: if you paint the background dark blue it will be processed at the furthest distance from the viewer’s cross-eyed perspective, while yellow is the nearest.
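
To get a rough feel for what the script does with that depth map (this is not [Don]'s actual script, just the general idea, and the grayscale depth map and file names are our own simplification), you can shift each pixel horizontally by an amount proportional to its depth and emit one image per eye:

```python
import numpy as np
from PIL import Image

def stereo_pair(photo_path, depth_path, max_shift=12):
    """Build a crude stereo pair from a photo plus a grayscale depth map
    (white = near, black = far).  Each pixel is pushed left or right by
    an amount proportional to its depth, one direction per eye."""
    img   = np.array(Image.open(photo_path).convert("RGB"))
    depth = np.array(Image.open(depth_path).convert("L"), dtype=float) / 255.0
    h, w, _ = img.shape
    shift = (depth * max_shift).astype(int)        # parallax in pixels

    left  = np.zeros_like(img)
    right = np.zeros_like(img)
    cols = np.arange(w)
    for y in range(h):
        # Nearer pixels end up further apart between the two views.
        left[y,  np.clip(cols + shift[y], 0, w - 1)] = img[y, cols]
        right[y, np.clip(cols - shift[y], 0, w - 1)] = img[y, cols]

    # Cross-eyed order puts the right eye's image on the left.
    return Image.fromarray(np.hstack([right, left]))

# Hypothetical file names, for illustration only.
stereo_pair("lennon.jpg", "lennon_depth.png").save("lennon_crosseye.png")
```

Swapping the two halves gives you a parallel pair instead of a cross-eyed one.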

[Don] mentions a parallel output and a cross-eyed output in his write-up. We understand the cross-eyed version, but are just guessing that the parallel version is meant for a stereoscopic viewer that puts a partition between the two images so that each eye sees a different frame. You know, like a View-Master.

Cheap DIY laser scanner is quite impressive

With the introduction of the Kinect, obtaining a 3D representation of a room or object became a much easier task than it had been in the past. If you lack the necessary cash for one, however, you have to get creative. Both the techniques and the technologies behind 3D scanning are somewhat complicated, though certainly still within reach, as maker [Shikai Chen] shows us. (Google Translation)

He wanted to create 3D scanned images, but he didn’t have the resources to purchase a Kinect. Instead, he built his own scanner for about 1/6th the cost. Interestingly enough, the scanner resembles what you might imagine a very early Kinect prototype would have looked like, though it functions a little differently from Microsoft’s creation. The scanner lacks any sort of IR emitter/camera combo, opting for a laser and a USB VGA camera instead. While scanning, the laser shines across the target surface, and the reflected light is picked up by the camera.
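
The math behind that arrangement is classic laser triangulation: in the simplest geometry, with the laser aimed parallel to the camera, the further away the surface is, the closer the laser's reflection appears to the camera's optical axis. A tiny sketch of that relationship, using made-up calibration numbers rather than anything from [Shikai Chen]'s build:

```python
import numpy as np

# Hypothetical calibration values, not the ones from this build.
BASELINE_M = 0.06     # distance between the laser and the camera axis (6 cm)
FOCAL_PX   = 700.0    # camera focal length expressed in pixels
CENTER_PX  = 320.0    # optical center column of a 640-pixel-wide frame

def depth_from_laser_column(col):
    """Depth in meters for a laser spot seen at image column `col`.
    With the laser parallel to the camera's optical axis, similar
    triangles give Z = f * b / x, where x is the spot's pixel offset
    from the image center."""
    x = abs(col - CENTER_PX)
    return float("inf") if x == 0 else FOCAL_PX * BASELINE_M / x

def find_laser_column(gray_row):
    """Brightest pixel in one image row: a crude stand-in for real
    laser-line extraction (thresholding, sub-pixel peak fitting, etc.)."""
    return int(np.argmax(gray_row))

# One scanline from a hypothetical 640-pixel-wide grayscale frame.
row = np.zeros(640)
row[412] = 255
print(depth_from_laser_column(find_laser_column(row)))   # roughly 0.46 m
```

Repeat that for every row of every frame while the laser sweeps the scene, and the depth samples stack up into a full scan.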

So how does this $25 DIY laser scanner measure up? Great, to be honest. Check out the video below to see how well his scanner works, and be sure to take a look through his second write-up (Google Translation) as well for more details on the project.

[via Seeedstudio]

EagleUp pulls your PCBs into SketchUp

[Karl] wrote in to tell us about a software package called EagleUp that will import your Eagle CAD PCB designs into Google SketchUp. It bridges the gap between the two using the open source image processing software ImageMagick.

As you can see above, you’ll end up with a beautifully rendered 3D model of your hardware. This is a wonderful way to make sure that your enclosure designs are going to work without waiting for the PCBs to arrive from the fab house. It is available for Windows, OS X, and Linux (although the last time we tried to run SketchUp under Wine nothing good came of it; perhaps it’s time to try again).

In [Karl’s] case, he’s working on an Arduino compatible board based around the Xmega. He mentions that EagleUp is a great way to get an idea of how component placement will end up, and to see if the silk screen layer is going to turn out well or not. Here’s a link to one of his test designs.

Amazing 3D telepresence system

It looks like the world of Kinect hacks is about to get a bit more interesting.

While many of the Kinect-based projects we see use one or two units, this 3D telepresence system developed by UNC Chapel Hill student [Andrew Maimone] under the guidance of [Henry Fuchs] has them all beat.

The setup uses up to four Kinect sensors at a single endpoint, capturing images from various angles before they are run through GPU-accelerated filters. Each video stream goes through a series of steps that fill holes and adjust colors to create a mesh image. Once the streams have been processed, they are overlaid with one another to form a complete 3D image.
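
To get a rough feel for the geometry involved (hugely simplified compared to the GPU-accelerated pipeline described above, and using assumed Kinect intrinsics), each sensor's depth frame is back-projected into a point cloud and then moved into a shared coordinate frame using that sensor's calibrated pose:

```python
import numpy as np

# Ballpark intrinsics for a 640x480 Kinect depth frame (assumed values).
FX = FY = 585.0
CX, CY = 320.0, 240.0

def depth_to_points(depth_m):
    """Back-project a depth frame (meters, shape 480x640) into an Nx3
    point cloud expressed in that camera's own coordinate frame."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.dstack([x, y, z]).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

def merge_clouds(frames_and_poses):
    """Transform each sensor's cloud by its calibrated 4x4 pose so all
    of the partial views land in one world frame, then stack them."""
    merged = []
    for depth, pose in frames_and_poses:
        pts = depth_to_points(depth)
        pts_h = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous
        merged.append((pts_h @ pose.T)[:, :3])
    return np.vstack(merged)
```

The hole filling, color adjustment, and meshing described above all happen on top of data like this; doing it at video rates is where the GPU acceleration comes in.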

The result is an awesome real-time 3D rendering of the subject and surrounding room that reminds us of this papercraft costume. The 3D video can be viewed at a remote station which uses a Kinect sensor to track your eye movements, altering the video feed’s perspective accordingly. The telepresence system also offers the ability to add in non-existent objects, making it a great tool for remote technology demonstrations and the like.

Check out the video below to see a thorough walkthrough of this 3D telepresence system.

Real-time digital puppetry

If it sometimes seems that there are only a finite number of things you can do with your kids, have you ever considered making movies? We don’t mean taking home videos – we’re talking about making actual movies where your kids can orchestrate the action and be the indirect stars of the show.

Maker [Friedrich Kirchner] has been working on an application called MovieSandbox, which is an open-source real-time animation tool. A couple of years in the making, the project runs on both Windows and Apple computers (with Linux support in the works), making it accessible to just about everyone.

His most recent example of the software’s power is a simple digital puppet show, which is sure to please young and old alike. Using sock puppets fitted with special flex sensors, he is able to control his on-screen cartoon characters by simply moving his puppets’ “mouths”. An Arduino is used to pass the sensor data to his software, while also allowing him to dynamically switch camera angles with a series of buttons.

Obviously something like this requires a bit of configuration in advance, but given a bit of time we imagine it would be pretty easy to set up a digital puppet stage that will keep your kids happily occupied for hours on end.

Continue reading to see a quick video of his sock puppet theater in action.

[via Make]