Using Kinect To Make Human Marionettes

[Choi Ka Fai] has been experimenting with neurostimulation for some time now. His body of work has focused on exploring the possibility of using neurostim devices to replay pre-recorded muscle movements.

Until now, he has been recording his muscle movements as acoustic waveforms for real-time playback in the bodies of his research partners. This usually requires him to sit beside the subject, tethered to a machine that limits his own movement, so he has invested in a new form of movement-recording technology – a Kinect sensor.

Using the fairly standard skeleton tracking we have seen in previous Kinect hacks, he can now direct the motion of his subject by merely moving in front of the camera. The benefit of the Kinect over wired sensors is that he can use any body part to direct his partner’s movements simply by changing how the software interprets his actions. As you can see in the video below, he uses his hands, knees, and even his head to direct the motion of his partner’s arm.
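[Choi]’s own software isn’t published, but the remapping trick he describes boils down to reading a different joint out of the skeleton stream. Here is a minimal sketch in Python, assuming a skeleton dictionary of joint positions and a hypothetical send_intensity() call into the neurostim hardware:

```python
# A sketch of the joint remapping, not [Choi]'s actual code. The joint
# names and the send_intensity() device call are hypothetical stand-ins.
def joint_to_intensity(skeleton, joint, lo=0.8, hi=1.6):
    # Map the joint's height (meters) onto a 0..1 stimulation level
    y = skeleton[joint][1]
    return min(max((y - lo) / (hi - lo), 0.0), 1.0)

def on_frame(skeleton, send_intensity, joint='right_hand'):
    # Driving the arm with a knee or the head is just a parameter change
    send_intensity(joint_to_intensity(skeleton, joint))
```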

It really is a neat application of the Kinect, and we are totally digging the shaky “human marionette” effect that it produces. Since this was only an initial test of the system, expect to see some more cool stuff coming from [Choi] in the near future.

Stick around to see a quick video of the Kinect-driven neurostim rig in action.

Continue reading “Using Kinect To Make Human Marionettes”

Super Refined Kinect Physics Demo

Since the Kinect has become so popular among hackers, [Brad Simpson] over at IDEO Labs finally purchased one for their office and immediately got to tinkering. In about 2-3 hours’ time, he put together a pretty cool physics demo showing off some of the Kinect’s abilities.

Rather than relying on the rough skeleton measurements most of the hacks we have seen use, he paid careful attention to the software side of things. Starting off at the Kinect’s full resolution (something not everybody does), [Brad] manipulated the data quite a bit before creating the video embedded below. The depth data was collected and run through several iterations of a smoothing algorithm to substantially reduce the noise around the resulting outline.
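The write-up doesn’t spell out the exact algorithm, but iterated neighbor-averaging of the outline points is a typical way to get this effect. A rough sketch, assuming the outline arrives as an ordered Nx2 NumPy array:

```python
import numpy as np

def smooth(contour, iterations=5):
    # Each pass replaces every point on the closed outline with the
    # average of itself and its two neighbors, damping depth-sensor noise
    for _ in range(iterations):
        contour = (np.roll(contour, 1, axis=0)
                   + contour
                   + np.roll(contour, -1, axis=0)) / 3.0
    return contour
```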

The final product is quite a bit different from the Kinect videos we are used to seeing, and it drastically improves how the user is able to interact with virtual objects added to the environment. As you may have noticed, the blocks added to the video rarely penetrate the outline of the individual in the movie. This isn’t due to some sort of digital trickery – [Brad] prevents the intersection of different objects through his tweaking of the Kinect data feed.
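One simple way to get that behavior is to test each block’s next position against a binary silhouette mask before committing the physics step. This is only an illustrative sketch, not [Brad]’s implementation:

```python
def step_block(block, mask):
    # block: dict with position/velocity; mask: boolean silhouette image
    nx, ny = int(block['x'] + block['vx']), int(block['y'] + block['vy'])
    if mask[ny, nx]:
        # The next position falls inside the person's outline: bounce
        block['vx'] *= -0.5
        block['vy'] *= -0.5
    else:
        block['x'] += block['vx']
        block['y'] += block['vy']
```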

We’re not sure how much computing power this whole setup requires, but the code is available from their Google Code repository, so we hope to see other projects refined by utilizing the techniques shown off here.

[via KinectHacks]

Continue reading “Super Refined Kinect Physics Demo”

Controlling Weapons With Kinect

It was only a matter of time before someone would figure out how to weaponize their Kinect. Hacker [Jonas Wagner] was fiddling with his Kinect one day and thought that it would be cool to launch missiles simply by gesturing. Not having any real missiles on hand, he settled for controlling a USB-powered foam missile launcher instead.

He mounted a webcam to the top of his rocket launcher to record video of his victims, and with a bit of Python along with the libfreenect library he was well on his way to world cubicle dominance. The Kinect waits for him to pull his hand out of its holster in dramatic fashion, monitoring his movements for tracking purposes. Once the launcher has been armed, the Kinect watches for [Jonas] to pull his hands out of frame before firing the rocket.
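The gist of that arming logic is easy to sketch with the freenect Python bindings, even without [Jonas]’s exact code: watch a patch of the depth image where the “holster” sits, arm when the hand leaves it, and fire once everything clears the frame. The region, thresholds, and fire_rocket() helper below are all placeholder assumptions:

```python
import freenect
import numpy as np

HOLSTER = (slice(300, 400), slice(500, 600))  # placeholder image region
NEAR = 600                                    # raw 11-bit depth; smaller = closer

def hand_present(depth, region):
    # "Hand present" = enough close-range pixels inside the region
    return np.count_nonzero(depth[region] < NEAR) > 200

armed = False
while True:
    depth, _ = freenect.sync_get_depth()      # grab one raw depth frame
    if not armed and not hand_present(depth, HOLSTER):
        armed = True                          # hand drawn from the holster
    elif armed and np.count_nonzero(depth < NEAR) < 50:
        fire_rocket()                         # hypothetical launcher call
        armed = False
```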

We doubt you’ll see this thing controlling weapons for DARPA any time soon, but it’s cool nonetheless. The launcher seems to move a touch slowly, but we’re guessing that with an uprated servo, things could be a bit snappier.

Continue reading for a quick video of the Kinect-powered rocket launcher in action.

[via KinectHacks]

Continue reading “Controlling Weapons With Kinect”

Art Installation Lets You Be Your Own Souvenir

The team at [blablabLAB] has been hard at work on their latest project, which they unleashed in Barcelona’s La Rambla pedestrian mall. Their art installation allows you to pose in the middle of the mall and receive a plastic statue of yourself as a souvenir.

Not unlike the “Fabricate Yourself” installation we saw a short time ago, this project uses the Kinect to create a 3D representation of the subject, though it uses three separate sensors rather than just one. The sensors are positioned around a central platform, creating a complete 3D model, which is then sent to a RapMan 3D printer stationed nearby.
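The fusion code isn’t included in the write-up, but the standard recipe is to calibrate a rigid transform for each sensor and express all three point clouds in a single platform-centered frame. A sketch assuming pre-computed rotation/translation pairs:

```python
import numpy as np

def merge_clouds(clouds, transforms):
    # clouds: list of Nx3 point arrays, one per Kinect
    # transforms: matching list of (R, t) extrinsics from calibration
    merged = [pts @ R.T + t for pts, (R, t) in zip(clouds, transforms)]
    return np.vstack(merged)  # one combined cloud, ready for meshing/printing
```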

Each user is then gifted a plastic representation of themselves to take home – it’s almost like an interactive human Mold-A-Rama. While the figures are neat, it would be great to see what sorts of plastic statues could be made using a higher resolution 3D printer like the one we featured a week ago.

Check out the video below to see the souvenir printer in action.

Continue reading “Art Installation Lets You Be Your Own Souvenir”

Kinect Hack Makes April Fools’ Prank A Reality

Unless you have been hiding out in a cave for the last week or so, you have heard about this year’s April Fools’ joke from Google. Gmail Motion was purported to be an action-driven interface for Gmail, complete with goofy poses and gestures for completing everyday email tasks. Unfortunately, it was all an elaborate joke, and no gesture-based Gmail interface is forthcoming… at least not from Google.

The team over at the USC Institute for Creative Technologies has stepped up and made Google’s hoax a reality. You might remember these guys from their Kinect-based World of Warcraft interface, which used body motions to emulate in-game keyboard actions. Using their Flexible Action and Articulated Skeleton Toolkit (FAAST), they developed a Kinect interface for Gmail which they have dubbed the Software Library Optimizing Obligatory Waving (SLOOW).

Their skeleton tracking software allows them to use all of the faux gestures Google dreamed up for controlling your inbox, however impractical they might be. We love a good April Fools’ joke, but we enjoy it even more when one becomes reality thanks to some clever thinking.
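FAAST has its own binding format, but the pose-to-keystroke idea can be sketched in a few lines of Python. As an illustration only (the joint layout and press_key callback are hypothetical), Gmail Motion’s joke “reply” gesture, a thumb jerked back over the shoulder, might be detected like this:

```python
def is_reply_gesture(hand, shoulder):
    # hand/shoulder: (x, y, z) joint positions, y up, z away from the sensor
    return hand[1] > shoulder[1] and hand[2] > shoulder[2]  # above and behind

def on_frame(skeleton, press_key):
    if is_reply_gesture(skeleton['right_hand'], skeleton['right_shoulder']):
        press_key('r')  # Gmail's existing keyboard shortcut for "reply"
```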

Stick around for a video demo of the SLOOW interface in action.

[via Adafruit]

Continue reading “Kinect Hack Makes April Fools’ Prank A Reality”

Kinect Produced Autostereograms (Magic Eye Pictures)

[Kyle McDonald], working collaboratively with [Golan Levin] at the Studio for Creative Inquiry, has come up with an application that can produce autostereograms. These are pictures that appear to be three-dimensional thanks to a visual illusion created by forcing your eyes to adjust focus and vergence differently than they normally would. The program is called ofxAutostereogram and it comes with a couple of examples. Both are shown in [Kyle’s] video (embedded after the break), starting with a depth image of a shark. This is combined with a texture tile, then processed through the openFrameworks software in the package to produce the final image.
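ofxAutostereogram itself is openFrameworks/C++, but the core trick fits in a few lines of Python: surfaces meant to look closer repeat the texture with a shorter horizontal period, which the brain fuses into depth once the eyes diverge. A rough sketch, assuming a depth map normalized to 0..1 and a NumPy texture tile:

```python
import numpy as np

def autostereogram(depth, tile, max_shift=40):
    # depth: HxW array, 0..1 with 1 = nearest; max_shift must stay below
    # the tile width so the repeat period remains positive
    h, w = depth.shape
    th, tw = tile.shape[:2]
    out = np.zeros((h, w) + tile.shape[2:], dtype=tile.dtype)
    for y in range(h):
        for x in range(w):
            shift = tw - int(depth[y, x] * max_shift)  # nearer = shorter period
            out[y, x] = tile[y % th, x % tw] if x < shift else out[y, x - shift]
    return out
```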

If that were all it did, you might find it rather unimpressive… these images have been around for some time, although they were never this easy to produce on your home computer. But the second example is a pretty fantastic one: you can use a depth image from a Kinect as the starting point. As seen above, there is a preview window where you can adjust the clipping planes in order to include the correct depth. This also gives you a preview of your pose. Once it’s just right, snap a pic and process it through the software.
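That clipping step amounts to clamping the raw readings to a near/far window and normalizing, so only the subject survives into the depth map fed to the generator. A sketch, with the millimeter units and default planes as assumptions:

```python
import numpy as np

def clip_depth(raw, near=600.0, far=1200.0):
    # Clamp to the [near, far] slab, then flip so nearer surfaces
    # approach 1.0, matching the 0..1 depth map the generator expects
    d = np.clip(raw.astype(float), near, far)
    return 1.0 - (d - near) / (far - near)
```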

Continue reading “Kinect Produced Autostereograms (Magic Eye Pictures)”

R/C Car Controlled By An IPad Or Kinect

R/C cars can be tons of fun, but sometimes the fun runs out after a while. [Gaurav] got bored of steering his R/C car around with its remote, so he built an interface that lets him control the car using two different motion-detecting devices.

He built an HTML5 application for his iPad, which allows him to steer the car around. As you can see in the video below, the application utilizes the iPad’s tilt sensor to activate the car’s motors and steering depending on where on the screen he has moved the guide marker.

The second steering method he devised uses his Kinect sensor to track his movements. His hand gestures are mapped to a set of virtual spaces similar to those used by the iPad app, as sketched below. When he moves his hands through these areas, the Arduino triggers the car’s remote just as it does with the iPad.
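A minimal sketch of the zone idea (the screen split and single-letter command bytes are assumptions, not [Gaurav]’s protocol):

```python
def command_for_hand(x, y, width, height):
    # Divide the sensor's view into steering regions, mirroring the
    # virtual areas the iPad app uses
    if y < height / 3:      return b'F'  # hand high: forward
    if y > 2 * height / 3:  return b'B'  # hand low: reverse
    if x < width / 3:       return b'L'
    if x > 2 * width / 3:   return b'R'
    return None                          # center: idle
```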

The actual remote control interface is achieved by wiring the car’s remote to an Arduino via a handful of opto-isolators. The Arduino is also connected to his computer via the serial port, where it waits for commands to be sent. In the case of the iPad, a Python server waits for commands to be issued from the HTML5 application. The Kinect’s interface is slightly different, with a C# application monitoring his movements and sending the commands directly to the serial port.
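The iPad-side bridge might look something like the stripped-down stand-in below. [Gaurav]’s actual server talks to an HTML5 app, so the raw TCP socket, port, and single-byte commands (matching the zone sketch above) are illustrative assumptions:

```python
import serial, socket

arduino = serial.Serial('/dev/ttyUSB0', 9600)  # Arduino on the serial port

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(('0.0.0.0', 8000))
listener.listen(1)

while True:
    conn, _ = listener.accept()
    cmd = conn.recv(1)       # one command byte per connection
    if cmd in (b'F', b'B', b'L', b'R'):
        arduino.write(cmd)   # Arduino pulses the matching opto-isolator
    conn.close()
```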

Check out the video below to see the car in action, and swing by his site if you are interested in grabbing some source code and giving it a try yourself.

Continue reading “R/C Car Controlled By An IPad Or Kinect”