Making Real-life Portals With A Kinect

[radicade] wanted to know what real-life portals would look like: not something out of a game, but actual blue and orange portals on his living room wall. Short of building a portal gun, the only option available to [radicade] was simulating a pair of portals with a Kinect and a projector.

One of the more interesting properties of portals is the ability to see through to the other side – you can look through the blue portal and see the world from the orange portal’s vantage point. [radicade] simulated the perspective of a portal using the head-tracking capabilities of a Kinect.

The Kinect grabs a depth map of the room and calculates what peering through a portal would look like. This virtual scene is projected onto the wall behind the Kinect, creating the illusion of real-life orange and blue portals.
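[radicade]'s code isn't shown, but the usual way to pull off this kind of head-coupled perspective is to treat the projected portal as a window and render the scene through an off-axis frustum built from the tracked head position. Below is a minimal sketch of that math, assuming the portal is an axis-aligned rectangle on the wall and the Kinect's head estimate lives in the same metric coordinate frame; the function name and layout are ours, not [radicade]'s.

```python
# Sketch: off-axis ("window") projection for a head-tracked portal illusion.
# Assumes the projected portal is an axis-aligned rectangle on the wall at z = 0
# and the tracked head position is in the same metric coordinate frame.
# These conventions are assumptions, not [radicade]'s published method.

def portal_frustum(head, portal_min, portal_max, near=0.05, far=50.0):
    """Return (left, right, bottom, top, near, far) for an off-axis frustum.

    head       -- (x, y, z) head position from the Kinect, z > 0 in front of the wall
    portal_min -- (x, y) lower-left corner of the portal on the wall (z = 0)
    portal_max -- (x, y) upper-right corner of the portal on the wall (z = 0)
    """
    hx, hy, hz = head
    # Offsets from the eye to the portal edges, measured in the wall plane,
    # then scaled back to the near plane (similar triangles: near / hz).
    scale = near / hz
    left   = (portal_min[0] - hx) * scale
    right  = (portal_max[0] - hx) * scale
    bottom = (portal_min[1] - hy) * scale
    top    = (portal_max[1] - hy) * scale
    return left, right, bottom, top, near, far


if __name__ == "__main__":
    # Head 1.5 m from the wall, slightly left of a 0.5 m x 1.0 m portal.
    print(portal_frustum(head=(-0.2, 1.2, 1.5),
                         portal_min=(0.0, 0.8), portal_max=(0.5, 1.8)))
```

The six numbers plug straight into a glFrustum-style projection, so the rendered view shifts as the viewer's head moves, which is what sells the illusion.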

We’ve seen this kind of pseudo-3D, head-tracking display before (1, 2), so it’s no surprise the 3D illusion of portals would carry over to a projected 3D display. You can check out [radicade]’s portal demo video after the break.

Continue reading “Making Real-life Portals With A Kinect”

3D Gesture Tracking With LIDAR

[Reza] has been working on detecting hand gestures with LIDAR for about 10 years now, and we’ve got to say the end result is worth the wait.

The build uses three small LIDAR sensors to measure the distance to an object. These sensors work by sending out an infrared pulse and recording the time of flight for the beam of light to be emitted, reflected off the target, and returned to a light sensor. Basically, it’s radar with infrared light. Three of these LIDAR sensors are mounted on a stand and plugged into an Arduino Uno. By measuring how far away an object is from each sensor, [Reza] can determine the object’s position in 3D space relative to the sensors.
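[Reza]'s firmware isn't published in this post, but turning three range readings into a 3D position is classic trilateration. The sketch below shows a closed-form version under an assumed sensor layout (one sensor at the origin, one on the x axis, one elsewhere in the x-y plane); the geometry of his actual stand may differ.

```python
# Sketch: recovering a 3D position from three distance readings (trilateration).
# The sensor layout (sensor 1 at the origin, sensor 2 at (d, 0, 0), sensor 3 at
# (i, j, 0)) is an assumption for illustration, not [Reza]'s actual geometry.
import math

def trilaterate(r1, r2, r3, d, i, j):
    """Position of the tracked object given distances to three fixed sensors."""
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z_sq = r1**2 - x**2 - y**2
    z = math.sqrt(max(z_sq, 0.0))   # clamp measurement noise that pushes z_sq below zero
    return x, y, z

if __name__ == "__main__":
    # Three sensors roughly 20 cm apart on a stand; a hand hovering above them.
    print(trilaterate(r1=0.30, r2=0.32, r3=0.31, d=0.20, i=0.10, j=0.17))
```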

Unlike the Kinect-based gesture applications we’ve seen, [Reza]’s LIDAR works outside in the sun. Because each LIDAR sensor measures distance a million times a second, it’s also much more responsive than a Kinect. Not bad for 10 years’ worth of work.

You can check out [Reza]’s gesture control demo, as well as a few demos of his LIDAR hardware after the break.

Continue reading “3D Gesture Tracking With LIDAR”

Hackaday Links May 13th 2012

Amazing ass… for a robot

Yep, Japan still has the creepy robotics market cornered. Case in point is this robotic posterior. Don’t worry, they’ve included a dissection so you can see how the insides work too. [via Gizmodo]

Time-lapse camera module results

As promised, [Quinn Dunki] sent in a link to the photo album from her time lapse camera module. In case you missed it, she built it in a Tic Tac container and stuck it to the side of a racecar.

Kinect controlled killbot

Didn’t we learn anything from RoboCop? We could totally see this Kinect controlled robot (which happens to weigh five tons) going out of control and liquefying an unsuspecting movie extra standing near it. [via Dvice]

Laser popping domino balloons

Apparently [Scott] has set a world record by using a laser to pop a line of 100 red balloons. We enjoy seeing the size of the 1W laser that does the popping… it can’t be long now before we get a hold of handheld laser pistols. [via Gizmodo]

Laser balloon targeting

If that last one was a bit of a letdown, you might enjoy this automatic targeting system more. The blue triangle-shaped icon is setting a target; the amber triangles have already been targeted. Once all the balloons are identified, a laser quickly zaps each in order. Quite impressive, although no details have been provided. [Thanks to everyone who sent in a link to this]

Sandbox Topographical Play Gets A Big Resolution Boost

Here’s another virtual-sandbox-meets-real-sandbox project. A team at UC Davis is behind this depth-mapped and digitally projected sandbox environment. The physical sandbox uses fine-grained sand, which serves nicely as a projection surface as well as a building medium. A Kinect depth camera hangs overhead, with an offset digital projector adding the virtual layer. As you dig or build up elevation in parts of the box, the depth camera changes the projected view to match in real time. As you can see after the break, this starts with topographical data, but can also include enhancements like the water feature seen above.
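The UC Davis code isn't reproduced here, but the core loop is easy to picture: read a depth frame, convert depth to elevation above the sand table, and band the elevation into colors for the projector. Here's a toy sketch of that mapping; the camera height, color bands, and water level are made-up numbers rather than the team's calibration.

```python
# Sketch: turning a Kinect depth frame into a projected topographic overlay.
# A toy version of the idea, not the UC Davis code: elevation is just
# (camera height - measured depth), banded into colors, with the lowest band
# standing in for water.
import numpy as np

CAMERA_HEIGHT_MM = 1000               # assumed Kinect-to-sand distance

# elevation band upper bounds (mm) and their projector colors (R, G, B)
BANDS = [(80, (30, 80, 200)),         # "water"
         (140, (210, 190, 120)),      # beach
         (220, (60, 160, 60)),        # lowland
         (300, (140, 110, 80)),       # hills
         (10_000, (240, 240, 240))]   # peaks

def colorize(depth_mm: np.ndarray) -> np.ndarray:
    """Map an (H, W) depth frame in millimetres to an (H, W, 3) uint8 image."""
    elevation = CAMERA_HEIGHT_MM - depth_mm
    out = np.zeros(depth_mm.shape + (3,), dtype=np.uint8)
    lower = -np.inf
    for upper, color in BANDS:
        out[(elevation > lower) & (elevation <= upper)] = color
        lower = upper
    return out

if __name__ == "__main__":
    fake_depth = np.random.randint(700, 1000, size=(480, 640)).astype(np.float32)
    print(colorize(fake_depth).shape)  # (480, 640, 3), ready to warp onto the sand
```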

It’s a big step forward in resolution compared to the project from which the team took inspiration. We have already seen this concept used as an interactive game. But we wonder about the potential of using this to quickly generate natural environments for digital gameplay. Just build up your topography in sand, jump into the video game and make sure it’s got the attributes you want, then start adding in trees and structures.

Don’t miss the video demo embedded after the break.

Continue reading “Sandbox Topographical Play Gets A Big Resolution Boost”

Huge Water And Light VU Meter Plus More

This is the senior design project for a group at the University of Vermont. It’s a wet, bubbly, blinky, interactive thing. Each column is a clear tube filled with water, with a string of fully addressable RGB LEDs suspended in the center. In idle mode, the lights scroll through a series of interesting patterns while the water is filled with bubbles to add some depth to the presentation. There is also a VU meter function, as seen here and during the Portal theme song that ends the video demo after the break.

A Teensy++ board is used to address the display. It’s set up to receive serial commands from a Processing script, which is responsible for generating the animations. At the top of the frame you can see there’s a Kinect sensor. By standing in the standard pose (we think it should be called the Kinect mug shot), you can switch the installation over to body control automatically. We could see this thing making its way into a long airport terminal hallway, following travelers along their trek from one terminal to the next.
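The group's serial protocol isn't documented in the post, so treat the following as a guess at what the Processing-to-Teensy link could look like: one start byte followed by an RGB triple per column. The port name, baud rate, and column count are all placeholders.

```python
# Sketch: pushing one frame of column colors to the Teensy++ over serial.
# The framing (0x7E header + one RGB triple per column) is an assumption for
# illustration; the actual protocol isn't documented. Requires pyserial.
import serial

NUM_COLUMNS = 8                       # assumed number of water columns

def send_frame(port: serial.Serial, colors):
    """colors: list of NUM_COLUMNS (r, g, b) tuples, each channel 0-255."""
    frame = bytearray([0x7E])         # start-of-frame marker
    for r, g, b in colors:
        frame += bytes((r, g, b))
    port.write(frame)

if __name__ == "__main__":
    # Needs the Teensy attached; the device path is a placeholder.
    with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as teensy:
        send_frame(teensy, [(0, 0, 40)] * NUM_COLUMNS)   # fill every column dim blue
```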

Continue reading “Huge Water And Light VU Meter Plus More”

ATtiny Powered Kinect Fire Cannons For Dance Fx

[Paul] is at it again with some Kinect-controlled fire poofers. You may remember [Paul]’s previous shenanigans with the gigantic hand-made, hydraulic, flame-sailed pirate ship. This time he is building a small flame poofer (possibly a series of poofers) for SOAK, a regional (unaffiliated) Burning Man-style festival in Oregon.

Anyone who remembers that build will recognize the brains of the new cannons: they’re just the pirate ship’s custom ATtiny boards, unceremoniously torn from their previous home and recycled for the new controller. This time, though, they have a Kinect! The build seems to function much like the evil genius simulator, simply using a height threshold to activate each cannon, but [Paul] has plans for the new system. This hardware test uses the closed-source OpenNI stack, but it will meet its full potential when it is reborn with SkelTrack, which was released just a few weeks ago. The cannons are going to go around a small, single-person dance floor, presumably with the Kinect nearby.
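The post doesn't include [Paul]'s trigger code, but a height-threshold poofer controller fits in a few lines. The threshold, lockout time, and callback interface below are assumptions for illustration; in the real build the decision ultimately drives his ATtiny-based cannon controller.

```python
# Sketch of a height-threshold trigger like the one described above.
# The Kinect side (a per-frame hand-height estimate) and the fire callback are
# stand-ins; the numbers are assumptions, not [Paul]'s settings.
import time

FIRE_HEIGHT_M = 1.8     # hands raised above this height fire the poofer
LOCKOUT_S     = 2.0     # minimum time between poofs, for safety

class PooferTrigger:
    def __init__(self, fire_callback):
        self.fire = fire_callback
        self.last_fire = 0.0

    def update(self, hand_height_m: float):
        """Call once per Kinect frame with the tracked hand height in metres."""
        now = time.monotonic()
        if hand_height_m > FIRE_HEIGHT_M and now - self.last_fire > LOCKOUT_S:
            self.last_fire = now
            self.fire()

if __name__ == "__main__":
    trigger = PooferTrigger(lambda: print("POOF"))
    for height in (1.2, 1.9, 1.95, 1.3):   # fake frames; only the first crossing fires
        trigger.update(height)
```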

Check out the brief test video after the jump.

Continue reading “ATtiny Powered Kinect Fire Cannons For Dance Fx”