More Kinect holograms from [programming4fun]

[programming4fun] has been playing around with his Kinect-based 3D display and building a holographic WALL-E controllable with a Windows Phone. It’s a ‘kid-safe’ version of his Terminator personal assistant that has voice control and supports both 3D anaglyph and shutter glasses.

When we saw [programming4fun]’s Kinect hologram setup last summer we were blown away. By tracking a user’s head with a Kinect, [programming4fun] was able to display a convincing 3D image using only a 2D projector. That build was adapted into a 3D multitouch table and real-life portals, so we’re glad to see [programming4fun] refining his code and coming up with some really neat builds.
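The head-coupled perspective trick at the heart of these builds can be sketched in a few lines: each virtual point is drawn where the line from the tracked eye position to that point crosses the screen plane. A minimal sketch of the geometry (coordinates and names are illustrative, not [programming4fun]’s actual code):

```python
# Head-coupled perspective: a point "behind" the screen is drawn where the
# ray from the viewer's eye through the point crosses the screen plane (z = 0).

def project_to_screen(head, point):
    """Intersect the eye->point ray with the screen plane z = 0.

    head  = (hx, hy, hz): tracked head position, hz > 0 (in front of screen)
    point = (px, py, pz): virtual point, pz < 0 (behind the screen)
    Returns the (x, y) screen coordinate where the point should be drawn.
    """
    hx, hy, hz = head
    px, py, pz = point
    t = hz / (hz - pz)          # parameter where the ray hits z = 0
    return (hx + t * (px - hx), hy + t * (py - hy))

# As the head moves right, a point behind the screen slides left relative
# to the viewer -- exactly the parallax cue that sells the illusion.
centered = project_to_screen((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
shifted  = project_to_screen((0.5, 0.0, 1.0), (0.0, 0.0, -1.0))
print(centered, shifted)
```

Redrawing the whole scene through this projection every frame, using the Kinect’s head position as `head`, is what turns an ordinary projector into a pseudo-3D display.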

In addition to robotic avatars catering to your every wish, [programming4fun] also put together a rudimentary helicopter flight simulator controlled by tilting a cell phone. It’s the same DirectX 9 heli from [programming4fun]’s original build, with the addition of Desert Strike-esque top-down graphics. This might be the future of gaming right here, so we’ll keep our eyes out for similar head-tracking 3D builds.

As always, videos after the break.

[Read more...]

Making real-life portals with a Kinect

[radicade] wanted to know what real-life portals would look like – not something out of a game, but actual blue and orange portals on his living room wall. Short of building a portal gun, the only option available to [radicade] was simulating a pair of portals with a Kinect and a projector.

One of the more interesting properties of portals is the ability to see through to the other side – you can look through the blue portal and see the world from the orange portal’s vantage point. [radicade] simulated the perspective of a portal using the head-tracking capabilities of a Kinect.

The Kinect grabs the depth map of a room, and calculates what peering through a portal would look like. This virtual scene is projected onto a wall behind the Kinect, creating the illusion of real-life orange and blue portals.
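The “peering through” computation boils down to a change of coordinate frames: the viewer’s tracked head position is expressed relative to the blue portal, flipped 180 degrees (walking toward one portal means looking out of the other), and re-expressed at the orange portal to get a virtual camera for rendering. A 2D sketch of that idea – the frames, positions, and angles here are invented for illustration, not taken from [radicade]’s code:

```python
import math

def portal_camera(head, blue_pos, blue_angle, orange_pos, orange_angle):
    """Map the tracked head (x, y) through the blue->orange portal pair.

    Positions are 2-D wall coordinates; angles are portal facing directions
    in radians. Returns the virtual camera position used to render the
    view seen "through" the blue portal.
    """
    # 1. Express the head in the blue portal's local frame.
    dx, dy = head[0] - blue_pos[0], head[1] - blue_pos[1]
    c, s = math.cos(-blue_angle), math.sin(-blue_angle)
    lx, ly = c * dx - s * dy, s * dx + c * dy
    # 2. Flip 180 degrees: approaching blue means looking out of orange.
    lx, ly = -lx, -ly
    # 3. Re-express in world coordinates at the orange portal.
    c, s = math.cos(orange_angle), math.sin(orange_angle)
    return (orange_pos[0] + c * lx - s * ly,
            orange_pos[1] + s * lx + c * ly)

# A head 1 unit in front of the blue portal becomes a virtual camera
# 1 unit "behind" the orange portal, looking back out of it.
print(portal_camera((1.0, 0.0), (0.0, 0.0), 0.0, (10.0, 0.0), 0.0))
```

Rendering the room’s depth map from that virtual camera, then projecting the result inside the portal outline, produces the see-through effect.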

We’ve seen this kind of pseudo-3D, head-tracking display before (1, 2), so it’s no surprise the 3D illusion of portals carries over to a projected 3D display. You can check out [radicade]’s portal demo video after the break.

[Read more...]

Sandbox topographical play gets a big resolution boost

Here’s another virtual-sandbox-meets-real-sandbox project. A team at UC Davis is behind this depth-mapped, digitally projected sandbox environment. The physical sandbox uses fine-grained sand, which serves nicely as a projection surface as well as a building medium. A Kinect depth camera hangs overhead, and an offset digital projector adds the virtual layer. As you dig or build up elevation in parts of the box, the depth camera changes the projected view to match in real time. As you can see after the break, this starts with topographical data, but it can also include enhancements like the water feature seen above.
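The projection layer is essentially a per-pixel function from depth to color: the sand’s height above the table picks a topographic band, with the lowest spots rendered as water. A toy version of that mapping – the band edges and camera height are made up for illustration, not the UC Davis calibration:

```python
# Each Kinect depth sample becomes an elevation (nearer to the overhead
# camera = higher sand), which selects a projected topographic color band.

def elevation_color(depth_mm, camera_height_mm=1000):
    """Map a raw depth reading (mm from the camera) to a color band."""
    elevation = camera_height_mm - depth_mm   # sand height above the table
    if elevation < 20:
        return "water"      # low spots get the virtual water layer
    elif elevation < 60:
        return "sand"
    elif elevation < 120:
        return "grass"
    else:
        return "rock"

# One projector frame is just this mapping applied to every pixel:
row = [990, 950, 900, 850]                  # depths across one scanline
print([elevation_color(d) for d in row])    # water -> rock as sand rises
```

Redoing this every frame as the depth map changes is what makes digging a hole instantly fill with projected water.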

It’s a big step forward in resolution compared to the project from which the team took inspiration. We’ve already seen this concept used as an interactive game, but we wonder about the potential of using it to quickly generate natural environments for digital gameplay: just build up your topography in sand, jump into the video game to make sure it’s got the attributes you want, then start adding trees and structures.

Don’t miss the video demo embedded after the break.

[Read more...]

ATtiny powered Kinect fire cannons for dance FX

[Paul] is at it again with some Kinect-controlled fire poofers. You may remember [Paul]’s previous shenanigans with the gigantic handmade hydraulic flame-sailed pirate ship. This time he is building a small flame poofer (possibly a series of poofers) for SOAK, a regional (unaffiliated) Burning Man-style festival in Oregon.

Anyone who remembers that build will recognize the brains of the new cannons: they are the pirate ship’s custom ATtiny boards, unceremoniously torn from their previous home and recycled for the new controller. This time, though, they have a Kinect! The build seems to function much like the evil genius simulator, simply using a height threshold to activate each cannon, but [Paul] has plans for the new system. This hardware test uses the closed-source skeleton tracking that comes with OpenNI, but the system will meet its full potential when it is reborn with Skeltrack, which was released just a few weeks ago. The cannons are going to surround a small single-person dance floor, presumably with the Kinect nearby.
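The height-threshold trigger described above is about as simple as control logic gets: compare the tracked hand height against one threshold per cannon. A hypothetical sketch – the thresholds and the idea of one threshold per cannon are our illustration, not [Paul]’s firmware:

```python
# Each cannon fires when the dancer's tracked hand rises past its
# threshold, so raising your arms higher lights more poofers.

CANNON_THRESHOLDS = [1.6, 1.8, 2.0]   # meters above the floor, one per cannon

def cannons_to_fire(hand_height_m):
    """Return the indices of cannons whose threshold the hand has crossed."""
    return [i for i, h in enumerate(CANNON_THRESHOLDS) if hand_height_m >= h]

print(cannons_to_fire(1.7))   # hand past 1.6 m -> first cannon only
print(cannons_to_fire(2.1))   # arms fully up -> all three poofers
```

In the real build this comparison would run against the hand joint reported by the skeleton tracker each frame, with the result driving the ATtiny boards that open the gas valves.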

Check out the brief test video after the jump.

[Read more...]

Multitouch table uses a Kinect for a 3D display

[Bastian] sent in a coffee table he built. This isn’t a place to set your drinks and copies of Make, though: it’s a multitouch table with a 3D display. Since no description can do this table justice, take a look at the video.

The build was inspired by the subject of this Hackaday post, where [programming4fun] was able to build a ‘holographic display’ using a regular 2D projector and a Kinect. Both builds work on the principle of redrawing the 3D space relative to the user’s head – as [Bastian] moves his head around the coffee table, the Kinect tracks his location and shifts the three-dimensional grid of boxes in the opposite direction. It’s extremely clever, and looks to be a promising user interface.

In addition to a Kinect, the coffee table uses a Microsoft Surface-like display: four infrared lasers are placed at the corners, and touches are detected with a camera next to the projector in the base.
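The touch-detection side of such a display is straightforward: fingers breaking the infrared laser plane show up as bright blobs to the camera, so thresholding the image and taking one centroid per connected blob yields the touch points. A rough sketch of that step – the threshold value and frame format are assumptions, not details from [Bastian]’s build:

```python
def find_touches(frame, threshold=200):
    """frame: 2-D list of IR brightness values. Returns one (x, y)
    centroid per 4-connected blob of bright pixels (one per finger)."""
    bright = {(x, y) for y, row in enumerate(frame)
                     for x, v in enumerate(row) if v >= threshold}
    touches = []
    while bright:
        stack = [bright.pop()]           # seed a new blob
        blob = []
        while stack:                     # flood-fill the connected blob
            x, y = stack.pop()
            blob.append((x, y))
            for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if n in bright:
                    bright.remove(n)
                    stack.append(n)
        cx = sum(p[0] for p in blob) / len(blob)
        cy = sum(p[1] for p in blob) / len(blob)
        touches.append((cx, cy))
    return touches

frame = [[0,   0, 0,   0],
         [0, 255, 0,   0],
         [0,   0, 0, 255]]
print(sorted(find_touches(frame)))       # two separate touch points
```

The resulting touch coordinates then feed the same scene that the Kinect’s head tracking is re-rendering, giving both input and pseudo-3D output on one table.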

After the break you can see the demo video and a gallery of the images [Bastian] put up on the NUI Group forum.

[Read more...]

Adding new features and controlling a Kinect from a couch

Upon the release of the Kinect, Microsoft showed off its golden child as the beginning of a revolution in user interface technology. The skeleton and motion detection promised a futuristic, hand-waving, “Minority Report”-style interface where your entire body controls a computer. The reality hasn’t exactly lived up to those expectations, but [Steve], along with his coworkers at Amulet Devices, has vastly improved the Kinect’s skeleton recognition so people can use a Kinect sitting down.

One huge drawback to using the Kinect for a Minority Report UI in a home theater is that Microsoft’s skeleton recognition doesn’t work well when you’re sitting down. Instead of relying on the built-in skeleton recognition that comes with the Kinect, [Steve] rolled his own skeleton detection using Haar classifiers.

Detecting Haar-like features is a staple of computer vision; it’s a great, not-very-computationally-intensive way to detect faces and body positions with a simple camera. The software does require training, and [Steve]’s app spent several days training itself. The results were worth it, though: the Kinect now recognizes [Steve] waving his arm while he is lying down on the couch.
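For the curious, a Haar-like feature is just a difference of rectangle sums, made cheap by precomputing an integral image – which is why a cascade of thousands of such features can still run in real time on modest hardware. A bare-bones illustration of the core computation (this is the textbook technique, not [Steve]’s actual detector):

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img over rows < y, cols < x."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        for x in range(w):
            ii[y+1][x+1] = img[y][x] + ii[y][x+1] + ii[y+1][x] - ii[y][x]
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w-by-h rectangle with top-left corner (x, y),
    in constant time regardless of rectangle size."""
    return ii[y+h][x+w] - ii[y][x+w] - ii[y+h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Left-minus-right two-rectangle feature: a vertical edge detector."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = [[1, 1, 0, 0],
       [1, 1, 0, 0]]
ii = integral_image(img)
print(haar_two_rect(ii, 0, 0, 4, 2))   # strong response on a left-bright edge
```

A trained classifier is a cascade of many such features with learned thresholds; the days of training [Steve]’s software went through are spent picking which rectangles and thresholds best separate body poses from background.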

Not content to stop there, [Steve] also threw voice recognition into his Kinect home theater controller – a fitting addition, as his employer makes a voice recognition remote control. The recognition software seems to work very well, even with the wistful Scottish accent [Steve] has honed over a lifetime.

[Steve]’s employer is giving away their improved Kinect software, which works with both the Xbox and Windows Kinects. If you’re ever going to do something with a Kinect that isn’t provided by the SDKs and APIs we covered earlier today, this will surely be an invaluable resource.

You can check out [Steve]‘s demo of the new Kinect software after the break.

[Read more...]