More Kinect Holograms From [programming4fun]

[programming4fun] has been playing around with his Kinect-based 3D display and building a holographic WALL-E controllable with a Windows Phone. It's a 'kid-safe' version of his Terminator personal assistant that has voice control and support for 3D anaglyph and shutter glasses.

When we saw [programming4fun]'s Kinect hologram setup last summer we were blown away. By tracking a user's head with a Kinect, [programming4fun] was able to display a 3D image using only a projector. That build was adapted into a 3D multitouch table and real-life portals, so we're glad to see [programming4fun] refining his code and coming up with some really neat builds.

In addition to robotic avatars catering to your every wish, [programming4fun] also put together a rudimentary helicopter flight simulator controlled by tilting a cell phone. It's the same DirectX 9 heli from [programming4fun]'s original build, with the addition of Desert Strike-esque top-down graphics. This might be the future of gaming, so we'll keep our eyes out for similar head-tracking 3D builds.

As always, videos after the break.

Continue reading “More Kinect Holograms From [programming4fun]”

Hackit: Leap Motion's New Motion Sensor

The big news circulating this morning is the Leap Motion sensor that will be hitting the market soon. The company claims its sensor is ~100× more accurate than anything else on the market right now. Check out the video to see what they mean (there's another at the link). This is pretty impressive. You can see the accuracy as they use a stylus to write things. If you've played with the Kinect, you know it is nowhere near this tight. Of course, the Kinect is scanning a massive space compared to this sensor's small working area. The response time looks very impressive as well; motions seem to be perfectly in sync with the input. We're excited to play with one when we get a chance.

Continue reading “Hackit: Leap Motion's New Motion Sensor”

Making Real-life Portals With A Kinect

[radicade] wanted to know what real-life portals would look like; not something out of a game, but actual blue and orange portals on his living room wall. Short of building a portal gun, the only option available to [radicade] was simulating a pair of portals with a Kinect and a projector.

One of the more interesting properties of portals is the ability to see through to the other side – you can look through the blue portal and see the world from the orange portal’s vantage point. [radicade] simulated the perspective of a portal using the head-tracking capabilities of a Kinect.

The Kinect grabs the depth map of a room, and calculates what peering through a portal would look like. This virtual scene is projected onto a wall behind the Kinect, creating the illusion of real-life orange and blue portals.
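[radicade] hasn't published his renderer, but the core transform is easy to sketch. Assume both portals hang on the same wall, with wall-relative coordinates (x right, y up, z out into the room), and that the Kinect reports the viewer's head position. Stepping "through" a portal amounts to a 180-degree turn about the up axis, so the view out of the orange portal is rendered from a virtual camera mirrored behind it. This is a toy sketch with made-up names and coordinates, not [radicade]'s actual code:

```python
def portal_camera(head, blue_center, orange_center):
    """Compute the virtual camera for looking into the blue portal and
    out of the orange one. Passing through the portal is a 180-degree
    rotation about the up axis, so the left/right (x) and in/out (z)
    offsets flip sign while the vertical (y) offset is preserved."""
    ox = head[0] - blue_center[0]   # viewer offset from the blue portal
    oy = head[1] - blue_center[1]
    oz = head[2] - blue_center[2]
    # Virtual camera sits behind the orange portal (negative z), looking
    # back out into the room.
    return (orange_center[0] - ox,
            orange_center[1] + oy,
            orange_center[2] - oz)

# A viewer standing half a meter right of the blue portal, slightly above
# its center, two meters out from the wall:
cam = portal_camera((0.5, 1.2, 2.0), (0.0, 1.0, 0.0), (3.0, 1.0, 0.0))
```

The scene captured by the Kinect's depth map would then be re-projected from `cam` and beamed onto the wall, which is what sells the illusion as your head moves.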

We've seen this kind of pseudo-3D, head-tracked display before (1, 2), so it's no surprise the 3D illusion of portals would carry over to a projected 3D display. You can check out [radicade]'s portal demo video after the break.

Continue reading “Making Real-life Portals With A Kinect”

3D Gesture Tracking With LIDAR

[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.

The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight, that is, the time for the beam of light to be emitted and reflected back to a light sensor. Basically, it's radar but with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object's position in 3D space relative to the sensors.
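The math behind this reduces to two textbook steps: convert a round-trip time into a distance (the pulse travels out and back, so halve the path), then trilaterate a 3D position from the three distances. [Reza]'s firmware and sensor layout aren't published here, so this is a generic sketch assuming the sensors sit at known coordinates (0,0,0), (d,0,0) and (i,j,0):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Distance from a time-of-flight measurement: the pulse covers the
    sensor-to-target path twice, so divide the round trip by two."""
    return C * round_trip_s / 2.0

def trilaterate(r1, r2, r3, d, i, j):
    """Locate a point from its distances r1, r2, r3 to three sensors at
    (0,0,0), (d,0,0) and (i,j,0). Standard closed-form trilateration;
    returns (x, y, z) with z >= 0, i.e. in front of the sensor rig."""
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z_sq = r1**2 - x**2 - y**2
    z = math.sqrt(max(z_sq, 0.0))  # clamp tiny negatives from noise
    return (x, y, z)
```

With sensors 1 m apart, a hand at (0.2, 0.3, 0.5) m produces three distances that recover the same point, which is all a gesture tracker needs per frame.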

Unlike the Kinect-based gesture applications we've seen, [Reza]'s LIDAR can work outside in the sun. Because each LIDAR sensor measures the distance a million times a second, it's also much more responsive than a Kinect. Not bad for 10 years' worth of work.

You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.

Continue reading “3D Gesture Tracking With LIDAR”

Hackaday Links: May 13th, 2012

Amazing ass… for a robot

Yep, Japan still has the creepy robotics market cornered. Case in point is this robotic posterior. Don’t worry, they’ve included a dissection so you can see how the insides work too. [via Gizmodo]

Time-lapse camera module results

As promised, [Quinn Dunki] sent in a link to the photo album from her time lapse camera module. In case you missed it, she built it in a Tic Tac container and stuck it to the side of a racecar.

Kinect controlled killbot

Didn’t we learn anything from RoboCop? We could totally see this Kinect controlled robot (which happens to weigh five tons) going out of control and liquefying an unsuspecting movie extra standing near it. [via Dvice]

Laser popping domino balloons

Apparently [Scott] has set a world record by using a laser to pop a line of 100 red balloons. We enjoy seeing the size of the 1W laser that does the popping… it can't be long now before we get a hold of handheld laser pistols. [via Gizmodo]

Laser balloon targeting

If that last one was a bit of a letdown, you might enjoy this automatic targeting system more. The blue triangle-shaped icon is setting a target; the amber triangles have already been targeted. Once all the balloons are identified, a laser quickly zaps each in order. Quite impressive, although no details have been provided. [Thanks to everyone who sent in a link to this]


Sandbox Topographical Play Gets A Big Resolution Boost

Here's another virtual sandbox meets real sandbox project. A team at UC Davis is behind this depth-mapped and digitally projected sandbox environment. The physical sandbox uses fine-grained sand, which serves nicely as a projection surface as well as a building medium. It includes a Kinect depth camera overhead and an offset digital projector to add the virtual layer. As you dig or build elevation in parts of the box, the depth camera changes the projected view to match in real time. As you can see after the break, this starts with topographical data, but can also include enhancements like the water feature seen above.
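The UC Davis build does its rendering on the GPU with a calibrated camera-to-projector transform, but the per-pixel idea reduces to mapping measured depth to an elevation color band: sand piled closer to the overhead Kinect reads as higher terrain. A minimal sketch with made-up calibration numbers (the real system would measure the bare-table depth and relief range during setup):

```python
def elevation_color(depth_mm, base_mm=1200, relief_mm=200):
    """Map one Kinect depth reading to a topographic color band.

    base_mm is the depth of the flat, empty table as seen from the
    overhead camera; sand piled higher returns a SMALLER depth value.
    relief_mm is the tallest pile we expect, so h runs 0 (table) to 1
    (max elevation). Both numbers are hypothetical calibration values.
    """
    h = (base_mm - depth_mm) / relief_mm
    h = max(0.0, min(1.0, h))  # clamp noisy readings into [0, 1]
    if h < 0.15:
        return "water"
    if h < 0.40:
        return "sand"
    if h < 0.70:
        return "grass"
    if h < 0.90:
        return "rock"
    return "snow"
```

Run over every pixel of the depth frame (plus contour lines at fixed h intervals), this is enough to repaint the projected map in real time as you dig.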

It’s a big step forward in resolution compared to the project from which the team took inspiration. We have already seen this concept used as an interactive game. But we wonder about the potential of using this to quickly generate natural environments for digital gameplay. Just build up your topography in sand, jump into the video game and make sure it’s got the attributes you want, then start adding in trees and structures.

Don’t miss the video demo embedded after the break.

Continue reading “Sandbox Topographical Play Gets A Big Resolution Boost”