Super Refined Kinect Physics Demo


Since the Kinect has become so popular among hackers, [Brad Simpson] over at IDEO Labs finally purchased one for their office and immediately got to tinkering. In about 2-3 hours’ time, he put together a pretty cool physics demo showing off some of the Kinect’s abilities.

Rather than relying on the rough skeleton measurements most hacks use, he paid careful attention to the software side of things. Starting off with the Kinect’s full resolution (something not everybody does), [Brad] manipulated the data quite a bit before creating the video embedded below. The silhouette data was run through several iterations of a smoothing algorithm, substantially reducing the noise around the resulting outline.
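[Brad’s] exact smoothing code isn’t reproduced in the write-up, but the general idea of running several passes of a simple filter over a silhouette can be sketched in a few lines of Python. The window size and iteration count here are illustrative assumptions, not his actual parameters:

```python
import numpy as np

def smooth_contour(points, window=5, iterations=3):
    """Run several passes of a moving-average filter over a closed
    2D contour (an (N, 2) array of x/y points) to knock down the
    pixel-level jitter typical of raw Kinect silhouettes."""
    pts = np.asarray(points, dtype=float)
    half = window // 2
    kernel = np.ones(window) / window
    for _ in range(iterations):
        # Pad with points from the other end so the contour stays closed.
        padded = np.concatenate([pts[-half:], pts, pts[:half]])
        pts = np.column_stack([
            np.convolve(padded[:, 0], kernel, mode="valid"),
            np.convolve(padded[:, 1], kernel, mode="valid"),
        ])
    return pts
```

Each pass trades a little sharpness for a lot less jitter, which is why several iterations give the clean outline seen in the video.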

The final product is quite a bit different from the Kinect videos we are used to seeing, and it drastically improves how the user is able to interact with virtual objects added to the environment. As you may have noticed, the blocks added to the video rarely, if ever, penetrate the outline of the individual in the movie. This isn’t due to some sort of digital trickery – [Brad] was able to prevent objects from intersecting through his tweaking of the Kinect data feed.

We’re not sure how much computing power this whole setup requires, but the code is available in their Google Code repository, so we hope to see other projects refined using the techniques shown off here.

[via KinectHacks]


Controlling Weapons With Kinect


It was only a matter of time before someone would figure out how to weaponize their Kinect. Hacker [Jonas Wagner] was fiddling with his Kinect one day and thought that it would be cool to launch missiles simply by gesturing. Not having any real missiles on hand, he settled for controlling a USB-powered foam missile launcher instead.

He mounted a webcam to the top of his rocket launcher to record video of his victims, and with a bit of Python along with the libfreenect library, he was well on his way to world cubicle dominance. The Kinect waits for him to pull his hand out of its holster in dramatic fashion, monitoring his movements for tracking purposes. Once the launcher has been armed, the Kinect watches for [Jonas] to pull his hands out of frame before firing the rocket.
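The arm-then-fire logic [Jonas] describes boils down to a small state machine over depth frames. The real thing would pull frames from libfreenect’s Python bindings; this sketch stands them in with NumPy arrays, and the depth threshold and “anything near counts as a hand” shortcut are assumptions for illustration:

```python
import numpy as np

ARM_THRESHOLD_MM = 1200  # hand closer than this "draws" the weapon (assumed value)

def hand_in_frame(depth_frame, threshold=ARM_THRESHOLD_MM):
    """True if anything in the depth frame is closer than the threshold,
    a crude stand-in for the tracked hand being raised."""
    return bool((depth_frame < threshold).any())

def step(state, depth_frame):
    """Tiny state machine: IDLE -> ARMED when the hand appears,
    ARMED -> FIRE once the hands leave the frame again."""
    if state == "IDLE" and hand_in_frame(depth_frame):
        return "ARMED"
    if state == "ARMED" and not hand_in_frame(depth_frame):
        return "FIRE"
    return state
```

The two-step gating is what keeps the launcher from firing the instant a hand is detected.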

We doubt you’ll see this thing controlling weapons for DARPA any time soon, but it’s cool nonetheless. The launcher seems to move a touch slowly, but we’re guessing that with an uprated servo, things could be a bit snappier.

Continue reading for a quick video of the Kinect-powered rocket launcher in action.

[via KinectHacks]


Art Installation Lets You Be Your Own Souvenir


The team at [blablabLAB] has been hard at work on their latest project, which they unleashed on La Rambla, Barcelona’s pedestrian mall. Their art installation allows you to pose in the middle of the mall and receive a plastic statue of yourself as a souvenir.

Not unlike the “Fabricate Yourself” installation we saw a short time ago, this project also uses the Kinect to create a 3D representation of the subject, though it uses three separate sensors rather than just one. The sensors are positioned around a central platform, creating a complete 3D model, which is then sent to a RapMan 3D printer stationed nearby.

Each user is then gifted a plastic representation of themselves to take home – it’s almost like an interactive human Mold-A-Rama. While the figures are neat, it would be great to see what sorts of plastic statues could be made using a higher resolution 3D printer like the one we featured a week ago.

Check out the video below to see the souvenir printer in action.


Kinect Hack Makes April Fools’ Prank A Reality


Unless you have been hiding out in a cave for the last week or so, you have heard about this year’s April Fools’ joke from Google. Gmail Motion was purported to be an action-driven interface for Gmail, complete with goofy poses and gestures for completing everyday email tasks. Unfortunately it was all an elaborate joke and no gesture-based Gmail interface is forthcoming…at least not from Google.

The team over at the USC Institute for Creative Technologies has stepped up and made Google’s hoax a reality. You might remember these guys from their Kinect-based World of Warcraft interface, which used body motions to emulate in-game keyboard actions. Using their Flexible Action and Articulated Skeleton Toolkit (FAAST), they developed a Kinect interface for Gmail, which they have dubbed the Software Library Optimizing Obligatory Waving (SLOOW).

Their skeleton tracking software allows them to use all of the faux gestures Google dreamed up for controlling your inbox, however impractical they might be. We love a good April Fools’ joke, but we really enjoy when they become reality via some clever thinking.

Stick around for a video demo of the SLOOW interface in action.

[via Adafruit]


Kinect Produced Autostereograms (Magic Eye Pictures)

[Kyle McDonald], working collaboratively with [Golan Levin] at the Studio for Creative Inquiry, has come up with an application that can produce autostereograms. These are pictures that appear three-dimensional thanks to a visual illusion created by forcing your eyes to adjust focus and vergence differently than they normally would. The program is called ofxAutostereogram, and it comes with a couple of examples. Both are shown in [Kyle’s] video (embedded after the break), starting with a depth image of a shark. This is combined with a texture tile, then processed through the openFrameworks software in the package to produce the final image.

If that were all it did, you might find it rather unimpressive… these images have been around for some time, although they were never this easy to produce on your home computer. But the second example is a pretty fantastic one. You can use a depth image from a Kinect as the starting point. As seen above, there is a preview window where you can adjust the clipping planes in order to include the correct depth range. This also gives you a preview of your pose. Once it’s just right, snap a pic and process it through the software.
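The core rendering trick behind any autostereogram is compact: every pixel copies the pixel one tile-width to its left, offset by an amount proportional to depth, and your eyes fuse the repeating pattern into apparent 3D. This isn’t the ofxAutostereogram code (that’s C++/openFrameworks); it’s a minimal Python/NumPy illustration of the same algorithm:

```python
import numpy as np

def autostereogram(depth, tile, max_shift=10):
    """Render a single-image stereogram from a depth map (values in
    [0, 1]) and a repeating texture tile. Closer pixels get a larger
    horizontal shift, which the viewer perceives as height."""
    h, w = depth.shape
    th, tw = tile.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            if x < tw:
                out[y, x] = tile[y % th, x]  # seed with the texture tile
            else:
                shift = int(depth[y, x] * max_shift)
                out[y, x] = out[y, x - tw + shift]
    return out
```

With a flat (all-zero) depth map this just tiles the texture; any depth variation perturbs the repetition period, which is exactly what the eye picks up as shape.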


R/C Car Controlled By An iPad Or Kinect


R/C cars can be tons of fun, but sometimes the fun runs out after a while. [Gaurav] got bored of steering his R/C car around with its remote, so he built an interface that lets him control the car using two different motion-detecting devices.

He built an HTML5 application for his iPad, which allows him to steer the car around. As you can see in the video below, the application utilizes the iPad’s tilt sensor to activate the car’s motors and steering depending on where on the screen he has moved the guide marker.

The second steering method he devised uses his Kinect sensor to track his movements. His hand gestures are mapped to a set of virtual spaces similar to those which the iPad uses. By moving his hands through these areas, the Arduino triggers the car’s remote just as it does with the iPad.

The actual remote control interface is achieved by wiring the car’s remote to an Arduino via a handful of opto-isolators. The Arduino is also connected to his computer via the serial port, where it waits for commands to be sent. In the case of the iPad, a Python server waits for commands to be issued from the HTML5 application. The Kinect’s interface is slightly different, with a C# application monitoring his movements and sending the commands directly to the serial port.
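The summary doesn’t spell out the command protocol, but the idea shared by both interfaces – reduce a 2D marker or hand position to a handful of drive commands – can be sketched like this. The zone layout and command letters are hypothetical, not [Gaurav’s] actual scheme:

```python
# Map a normalized (x, y) position -- the iPad guide marker or a tracked
# hand, both scaled into [0, 1] -- to a single-character drive command
# that an Arduino could turn into opto-isolator pulses on the remote.

DEAD_ZONE = 0.2  # no command while the marker sits near the center

def position_to_command(x, y):
    dx, dy = x - 0.5, y - 0.5
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return "S"                      # stop
    if abs(dy) >= abs(dx):
        return "F" if dy < 0 else "B"   # forward / back (screen y grows downward)
    return "L" if dx < 0 else "R"       # left / right
```

On the sending side, each returned character could then go out over pySerial with something like `serial.Serial('/dev/ttyUSB0', 9600).write(cmd.encode())`, leaving the Arduino to pulse the matching opto-isolator.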

Check out the video below to see the car in action, and swing by his site if you are interested in grabbing some source code and giving it a try yourself.


Kinect Two-fer: MoCap Movie And Robot Control


It’s no mystery that we like the Kinect around here, which is why we’re bringing you a Kinect two-fer today.

We have seen video hacks using the Kinect before, and this one ranks up there on the coolness scale. In [Torben’s] short film about an animation student nearly missing his assignment deadline, the Kinect was used to script the animation of a stick figure model. The animation was captured and built in Maya, then overlaid on a separate video clip to complete the movie. The overall quality is great, though you can notice some of the typical “jitter” that the Kinect is known for, and there are a few places where the model sinks into the floor a bit.

If you want to try your hand at animation using the Kinect, all of the scripts used to make the movie are available on the creator’s site for free. [via Kinect-Hacks]

Our second Kinect item comes in the form of a gesture-driven Lego MindStorms bot. Using OpenNI along with PrimeSense for body tracking, [rasomuro] was able to use simple motions to drive his NXT bot around the house. His movements are tracked by the Kinect sensor and translated into commands relayed to the robot via his laptop’s Bluetooth connection. Since the robot has two motors, he mapped a couple of simple arm motions to drive the bot around. We’ll be honest when we say that the motions remind us of Will Ferrell’s “Frank the Tank” scene in Old School, but [rasomuro] says that he is trying to simulate the use of levers to drive the bot. Either way, it’s pretty cool.
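The “levers” mapping [rasomuro] describes fits a differential-drive bot naturally: each hand’s height sets one track’s motor power, like the two control levers on a tank or excavator. A minimal sketch of that mapping, where the neutral height and gain are assumed values rather than his actual tuning:

```python
def levers_to_motor_power(left_hand_y, right_hand_y, neutral=0.5, gain=200):
    """Tank-drive 'levers': each hand's normalized height (0 = bottom of
    frame, 1 = top) drives one track. A hand held above the neutral line
    runs that side forward; below it, backward. Output is clamped to the
    NXT's -100..100 motor power range."""
    def power(y):
        return max(-100, min(100, int((y - neutral) * gain)))
    return power(left_hand_y), power(right_hand_y)
```

Raising both hands drives straight ahead, while raising one and lowering the other spins the bot in place – which is why the pose ends up looking like “Frank the Tank.”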

Videos of both hacks are embedded below for your perusal.

If you are interested in seeing some more cool Kinect hacks be sure to check out this Minecraft interface trio, this cool Kinect realtime video overlay, and this Kinect-Nerf gun video game interface.
