Official Kinect SDK Released

Microsoft just released the beta of the Kinect for Windows SDK. Although “Microsoft does not condone the modification of its products,” it appears the company has changed its tune, releasing APIs for C++, C#, and Visual Basic seven months after the Kinect was officially hacked.

We’ve seen libraries developed since the launch of the Kinect, culminating in the OpenKinect project. Microsoft’s release covers the same ground as OpenKinect, and will hopefully improve on existing attempts to get audio out of the Kinect.

We’ve seen Kinect hacks run the gamut from telepresence to robotics to 3D modeling, so the Kinect seems like a great addition to any builder’s arsenal. Even though most of the SDK’s functionality has already been replicated by the open-source community, it’s nice to know there’s official support behind all the great projects we’ve seen.

Create And Conflagrate Giant Modeled Sculptures With Kinect And CNC

Summer has hit, and with it a bunch of crazy people going to crazy festivals and (often) burning crazy sculptures to crazy music! In that vein, [Matthew Goodman] recently got involved in the Burning Flipside community down in Texas for his first big effigy build. The project called for a gigantic archway flanked by two human-shaped figures, and since he had been working with the Kinect, [Matt] decided to try his hand at physically modeling the figures from Kinect mesh data.

After co-registering the depth and image cameras, setting up a capture routine, extracting .ply-based meshes from the depth camera, and writing a keypoint detector, [Matt] was ready to start pulling real-world data from the Kinect. Armed with a ghetto steadycam built from his local Austin Hackerspace’s spare-parts bin, [Matt] proceeded to collect three 1.5 gigabyte scans of the charming [KT], who served as the model for the sculpture.
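
[Matt]’s pipeline isn’t published in detail, but the depth-to-mesh step can be sketched in Python: back-project a Kinect depth map into 3D points using pinhole intrinsics, then dump the cloud as an ASCII .ply. The intrinsic values and function names below are illustrative placeholders, not [Matt]’s actual code.

```python
def depth_to_points(depth, fx=594.2, fy=591.0, cx=320.0, cy=240.0):
    """Back-project a depth image (metres) into camera-space 3D points.
    The intrinsics defaults are typical published Kinect values, used
    here only as placeholders."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:  # zero marks "no reading" in Kinect depth maps
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

def write_ply(points, path):
    """Dump a point cloud as a minimal ASCII .ply file."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write("element vertex %d\n" % len(points))
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write("%f %f %f\n" % (x, y, z))
```

A meshing step (and [Matt]’s keypoint detector) would then run on top of clouds like these.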

Once the meshes were imported into SketchUp, they could be merged and smoothed into a coherent form. The figure was split into CNC-able parts (known as the “lady bits” by [Matt] and his crew) and cut on local makers [Dave Umlas] and [Marrilee Ratcliff]’s ShopBot CNC mill. The 400-some-odd pieces of wood were then carted to Flipside, methodically set up, and promptly set aflame at the end of the event.

We have seen a couple of really interesting Burning Man projects, but this one may have the shortest-lived end result. Stay tuned this summer for more insane Black Rock City-bound creations as well. Also, don’t forget to check out [Matt]’s site for more details.

The Kinect Controlled Zombie Skeleton

Although there is no shortage of Kinect hacks out there, this one from Dashhacks seems especially cool. According to them, the software side of this design uses “modified OpenNI programming along with GlovePIE to send WiiMote commands to the cyborg such as jaw and torso movement along with MorphVOX to create the voice for the cybernetic monstrosity.” As pointed out in the video, this robotic zombie also has a “pause” feature, as well as the ability to loop movements, as would be done at an amusement park.

The other great thing about this hack is how well the skeleton is actuated via servo motors. Although it’s difficult to tell exactly how many servos were used, the robot certainly has 10 or more degrees of freedom between the head, both arms, and the torso. To control all of this, a hacked Wiimote and Nunchuck are used in conjunction with the Kinect. Check out the video after the break.
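
While this build relays commands through GlovePIE and a Wiimote, the last step of any such rig is mapping a control value onto servo positions. A minimal sketch, assuming normalised inputs and standard hobby-servo pulse timing (the function name and range values are our own, not part of the Dashhacks build):

```python
def servo_pulse_us(value, min_us=1000, max_us=2000):
    """Map a normalised control input (0.0 to 1.0) onto a hobby-servo
    pulse width in microseconds; 1000-2000 us spans the usual travel.
    Out-of-range inputs are clamped so the servo never over-drives."""
    value = max(0.0, min(1.0, value))
    return min_us + value * (max_us - min_us)
```

For a jaw joint, for example, a 0.0 input would hold the mouth closed and 1.0 would swing it fully open.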

Continue reading “The Kinect Controlled Zombie Skeleton”

Amazing 3D Telepresence System


It looks like the world of Kinect hacks is about to get a bit more interesting.

While many of the Kinect-based projects we see use one or two units, this 3D telepresence system developed by UNC Chapel Hill student [Andrew Maimone] under the guidance of [Henry Fuchs] has them all beat.

The setup uses up to four Kinect sensors in a single endpoint, capturing images from various angles before they are processed using GPU-accelerated filters. The video captured by the cameras is processed in a series of steps, filling holes and adjusting colors to create a mesh image. Once the video streams have been processed, they are overlaid with one another to form a complete 3D image.
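
The actual system runs GPU-accelerated filters, but the hole-filling idea can be illustrated with a toy CPU sketch: replace missing depth readings with an average of their valid neighbours. The single-pass neighbour-average strategy and names here are our own, not [Andrew]’s implementation.

```python
def fill_holes(depth):
    """One pass of hole filling: replace zero ("no data") pixels with
    the mean of their non-zero 4-neighbours, leaving valid pixels
    untouched. Real pipelines iterate and run on the GPU."""
    h, w = len(depth), len(depth[0])
    out = [row[:] for row in depth]
    for y in range(h):
        for x in range(w):
            if depth[y][x] != 0:
                continue
            neighbours = []
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and depth[ny][nx] != 0:
                    neighbours.append(depth[ny][nx])
            if neighbours:
                out[y][x] = sum(neighbours) / len(neighbours)
    return out
```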

The result is an awesome real-time 3D rendering of the subject and surrounding room that reminds us of this papercraft costume. The 3D video can be viewed at a remote station which uses a Kinect sensor to track your eye movements, altering the video feed’s perspective accordingly. The telepresence system also offers the ability to add in non-existent objects, making it a great tool for remote technology demonstrations and the like.
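
As a rough sketch of how tracked head position could drive the viewpoint (the real rendering math involves off-axis projection and is considerably more involved; the names and neutral position below are assumptions):

```python
def view_offset(head_pos, neutral=(0.0, 0.0, 0.6), scale=1.0):
    """Turn a tracked head position (metres, screen-centred coordinates)
    into a virtual-camera translation, so the rendered view shifts with
    the viewer: move your head left and you see around the right side
    of the remote subject."""
    return tuple(scale * (h - n) for h, n in zip(head_pos, neutral))
```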

Check out the video below to see a thorough walkthrough of this 3D telepresence system.

Continue reading “Amazing 3D Telepresence System”

Kinect-driven Cart Makes Shopping A Snap


[Luis de Matos] is working on a neat Kinect project called Wi-GO that aims, as many do, to enhance the lives of individuals with disabilities. While the Wi-GO project is geared towards disabled persons, it can be quite helpful to the elderly and pregnant women as well.

Wi-GO is a motorized shopping cart with a Kinect sensor mounted on the back. The sensor interfaces with a laptop and functions much as you would expect, scanning the area in front of the cart for objects and people. Once it identifies the individual it is meant to help, the cart diligently follows behind as the person goes about their typical shopping routine. The robot keeps a safe distance to avoid collisions, but remains within reach so that it can be used to carry goods.
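
The project page doesn’t spell out the control loop, but a follow-behind controller like Wi-GO’s can be sketched as a simple proportional law on the tracked user’s distance and bearing. The gains, target distance, and deadband here are made-up values, not taken from [Luis]’s code.

```python
def follow_command(distance, bearing, target=1.0, deadband=0.15,
                   k_lin=0.8, k_ang=1.5):
    """Compute (linear, angular) velocity commands to trail a person.
    distance: metres to the tracked user; bearing: radians off-centre.
    The deadband stops the cart from constantly nudging the shopper."""
    error = distance - target
    linear = k_lin * error if abs(error) > deadband else 0.0
    angular = k_ang * bearing
    return linear, angular
```

Commands like these would then be fed to the cart’s motor drivers each frame.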

If you take a look at the video below, you can see Wi-GO in action. It starts off by showing how difficult it would be for an individual in a wheelchair to use a shopping cart alone, and follows up by showing how much easier things are with Wi-GO in tow.

While the project is only in prototype form at the moment, we suspect that it will only be a matter of time until you see devices like Wi-GO in your local supermarket.

Continue reading “Kinect-driven Cart Makes Shopping A Snap”

Juggling With Kinect


Some of the Kinect hacks we have featured here are quite useful in the realm of assisted living, others showcase what can be done with the clever application of video filters. Some…are just plain fun.

This pair of Kinect hacks isn’t necessarily going to win any awards for usefulness, but they are big on fun. [Tom] put together a neat juggling application that watches for your hands to disappear behind your back, generating a glowing ball once they return to the camera’s field of vision. The balls can be tossed away or juggled, as you can see in the video below. It looks like it could be pretty fun, and most definitely easier than chasing balls around while learning to juggle.
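
[Tom]’s source is linked from the Google Code site; the core spawn-on-reappearance logic might look something like this sketch (our own names and structure, not [Tom]’s code):

```python
class BallSpawner:
    """Spawn a virtual ball each time a tracked hand reappears after
    being hidden, e.g. behind the juggler's back."""
    def __init__(self):
        self.was_visible = True
        self.balls = 0

    def update(self, hand_visible):
        """Call once per Kinect frame with the hand's visibility flag;
        returns the running ball count."""
        if hand_visible and not self.was_visible:
            self.balls += 1  # hand came back into view: conjure a ball
        self.was_visible = hand_visible
        return self.balls
```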

[Tom’s] hack was based on code he saw demonstrated in a video by YouTube user [hogehoge335]. That application allows [hogehoge335] to replicate the Kamehameha attack from Dragonball Z, flowing hair and all.

Check out the videos below for a demonstration of both Kinect hacks, and swing by the respective Google Code sites if you want to give them a try.

Continue reading “Juggling With Kinect”

Advanced Robotic Arm Control Using Kinect


[Ryan Lloyd], [Sandeep Dhull], and [Ruben D’Sa] wrote in to share a robotics project they have been keeping busy with lately. The three University of Minnesota students are using a Kinect sensor to remotely control a robotic arm, but it’s not as simple as it sounds.

Using OpenNI alongside PrimeSense, the team started out by doing some simple skeleton tracking before working with their robotic arm. The arm has five degrees of freedom, making the task of controlling it a bit tricky. The robot has quite a few joints to play with, so the trio not only tracks shoulder, elbow, and wrist movements, but they also monitor the status of the user’s hand to actuate the robot’s gripper.
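
Mapping skeleton tracking onto a robot boils down to computing joint angles from tracked 3D positions. A minimal sketch of the elbow-angle calculation, illustrative only and not the team’s code:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by points a-b-c, e.g. the
    elbow angle from shoulder, elbow, and wrist positions in Kinect
    skeleton coordinates."""
    ba = [ai - bi for ai, bi in zip(a, b)]
    bc = [ci - bi for ci, bi in zip(c, b)]
    dot = sum(x * y for x, y in zip(ba, bc))
    norm = (math.sqrt(sum(x * x for x in ba))
            * math.sqrt(sum(x * x for x in bc)))
    # Clamp guards against floating-point drift outside acos's domain.
    return math.acos(max(-1.0, min(1.0, dot / norm)))
```

A fully straightened arm gives an angle of pi at the elbow; a right-angle bend gives pi/2, which can then be scaled onto the corresponding servo.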

When everything was said and done, the results were pretty impressive as you can see in the video below, but the team definitely sees room for improvement. Using inverse kinematics, they plan on filtering out some of the joint tracking inaccuracies that occur when the shoulders are moved in a certain way. They also plan on using a robotic arm with even more degrees of freedom to see just how well their software can perform.
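
Inverse kinematics has a closed form in the simple planar two-link case, which hints at what the team is moving toward (the link lengths and function name here are placeholders, and their five-degree-of-freedom arm would need a more general solver):

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.25):
    """Closed-form inverse kinematics for a planar 2-link arm: return
    (shoulder, elbow) angles in radians reaching point (x, y), or None
    if the target is out of reach. Link lengths are placeholders."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        return None  # target outside the arm's reachable annulus
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Solving for angles like this, rather than mapping raw joint positions one-to-one, is what lets a controller smooth over noisy shoulder tracking.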

Be sure to check out their site to see more details and videos.

Continue reading “Advanced Robotic Arm Control Using Kinect”