Building A Zoetrope Using Kinect, Processing, And A Laser Cutter

A zoetrope is a device built around a disk bearing a series of images that make up an animation. A couple of different methods can be used to trick the eye into seeing a single animated image. In the past this was done by placing the images inside a cylinder with slits at regular intervals. When spun quickly, the slits appear to be stationary while the images create the animation. But the same effect can be accomplished using a strobe light.

The disk you see above uses the strobe method, but its design and construction are what caught our eye. The animated shapes were captured with a Kinect and isolated using Processing. [Greg Borenstein] took a depth movie recorded while someone danced in front of a Kinect, ran it through a Processing sketch, and isolated a set of slides that were then turned into the objects seen above using a laser cutter.
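The frame-isolation step is straightforward to sketch. Below is a rough Python approximation of the idea (not [Greg]'s actual Processing sketch): threshold each depth frame to keep only the dancer's silhouette, then pick evenly spaced frames, one per zoetrope slot.

```python
def silhouette(depth_frame, near, far):
    """Keep only pixels whose depth (in mm) falls between near and far,
    producing a 1/0 mask suitable for tracing into a laser-cut outline."""
    return [[1 if near <= d <= far else 0 for d in row] for row in depth_frame]

def pick_slides(frames, count):
    """Choose `count` evenly spaced frames from a recorded depth movie,
    one per slot on the zoetrope disk."""
    step = len(frames) / count
    return [frames[int(i * step)] for i in range(count)]
```

From there each mask would still need to be traced into a vector outline before it can be sent to the cutter.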

You can watch a video of this particular zoetrope after the break. But we’ve also embedded the Pixar 3D zoetrope clip which, although unrelated to this hack, is extremely interesting. Don’t have a laser cutter to try this out yourself? You could always build a zoetrope that uses a printed disk.

Continue reading “Building A Zoetrope Using Kinect, Processing, And A Laser Cutter”

Giving “sight” To The Visually Impaired With Kinect

We have seen Kinect used in a variety of clever ways over the last few months, but some students at the [University of Konstanz] have taken Kinect hacking to a whole new level of usefulness. Rather than use it to control lightning or to kick around some boxes using Garry’s Mod, they are using it to develop Navigational Aids for the Visually Impaired, or NAVI for short.

A helmet-mounted Kinect sensor sits on the subject’s head and connects to a laptop stowed in the user’s backpack. The Kinect is interfaced using custom software that uses depth information to generate a virtual map of the environment. The computer sends information to an Arduino board, which then relays those signals to one of three waist-belt mounted LilyPad Arduinos. The LilyPads control three motors, which vibrate to alert the user to obstacles. The group even added voice notifications via specialized markers, allowing them to alert the user to the presence of doors and other specific items of note.
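As a rough illustration of the obstacle-to-vibration idea (the team’s actual software is more sophisticated), here is a Python sketch that splits a row of Kinect depth readings into left/center/right zones and maps the nearest obstacle in each zone to a motor strength:

```python
def motor_levels(depth_row, max_range=3000):
    """Split a row of depth readings (mm) into left/center/right thirds and
    map the nearest obstacle in each third to a 0-255 vibration strength.
    The zone count matches NAVI's three waist-mounted motors; the range
    value is an assumption."""
    third = len(depth_row) // 3
    zones = [depth_row[:third], depth_row[third:2 * third], depth_row[2 * third:]]
    levels = []
    for zone in zones:
        nearest = min(zone)
        # Closer obstacles vibrate harder; anything at or beyond
        # max_range stays silent.
        levels.append(max(0, 255 - int(255 * nearest / max_range)))
    return levels
```

Each strength value would then be shipped over serial to the Arduino driving the corresponding LilyPad.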

It really is a great use of the Kinect sensor; we can’t wait to see more projects like this in the future.

Stick around to see a quick video of NAVI in use.

[via Kinect-Hacks – thanks, Jared]

Continue reading “Giving “sight” To The Visually Impaired With Kinect”

The Evil Genius Simulator: Kinect Controlled Tesla Coils

The London Hackspace crew was having a tough time getting their Kinect demos running at Maker Faire 2011. While at the pub, they had the idea of combining forces with Brightarcs’ Tesla coils, and they produced The Evil Genius Simulator!

After getting the go-ahead from Brightarcs and the input specs of the coils, they came up with an application in openFrameworks which uses skeletal tracking data to determine hand position. The hand position is scaled between two manually set calibration bars (seen in the video, below). The scaled position then speeds up or slows down a 50Hz WAV file to produce the 50-200Hz sine wave required by each coil. It only took an hour, but the results are brilliant; video after the jump.
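The mapping itself is simple enough to sketch in a few lines. This Python version (an illustration, not the team’s openFrameworks code) clamps the hand position between the calibration bars and scales it onto the coils’ 50-200Hz range:

```python
def coil_frequency(hand_y, cal_low, cal_high, f_min=50.0, f_max=200.0):
    """Scale a tracked hand height between the two calibration bars onto
    the 50-200 Hz range the coils accept."""
    t = (hand_y - cal_low) / (cal_high - cal_low)
    t = min(1.0, max(0.0, t))  # clamp positions outside the bars
    return f_min + t * (f_max - f_min)

def playback_rate(target_hz, base_hz=50.0):
    """Resampling factor applied to the 50 Hz WAV to hit the target pitch."""
    return target_hz / base_hz
```

Raising a hand from the lower bar to the upper bar sweeps the coil from 50Hz up to 200Hz, i.e. from 1x to 4x playback speed.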

There are all these previously featured stories on the Kinect, and we’ve seen Tesla coils that respond to music, coils that make music, and even MIDI-controlled coils. It’s nice to see it all combined.

Thanks to [Matt Lloyd] for the tip!

Continue reading “The Evil Genius Simulator: Kinect Controlled Tesla Coils”

3D Modeling Out Of Thin Air

It seems that with each passing day, the Kinect hacks that we see become exponentially more impressive. Take for instance this little number that was sent to us today.

[sonsofsol] has combined several open source software packages and a little electronics know-how to create one of the more useful Kinect hacks we have seen lately. His project enables him to manipulate 3D models in GEM simply by moving his hands about in front of his Kinect sensor. Using OpenNI and Ubuntu, all of his actions are tracked by the computer and translated into actions within the GEM 3D engine.

To make things easier on himself, he also constructed a pair of electronic gloves that interface with the system. Using an Arduino, the gloves send different complex commands to the 3D modeling software, just by touching different pairs of fingers together.
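The glove logic likely boils down to a lookup table. Here’s a hypothetical Python sketch of that idea; the finger pairs and command names are invented for illustration, since [sonsofsol]’s actual mapping isn’t documented here:

```python
# Hypothetical pairing of glove contacts to modeling commands -- the real
# mapping lives in [sonsofsol]'s Arduino and GEM patch.
COMMANDS = {
    ("thumb", "index"): "grab_vertex",
    ("thumb", "middle"): "extrude_face",
    ("thumb", "ring"): "toggle_wireframe",
}

def glove_command(finger_a, finger_b):
    """Look up the command for a touched finger pair, ignoring touch order.
    Returns None for pairs with no assigned command."""
    return COMMANDS.get((finger_a, finger_b)) or COMMANDS.get((finger_b, finger_a))
```

On the hardware side, each contact pair would close a circuit read by the Arduino, which then sends the matching command string over serial.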

You really need to take a look at the video embedded below to get a feel for how complex [sonsofsol’s] “simple” mesh modeler really is.

Looking for more Kinect fun? Check out these previously featured stories.

[Thanks, Jared]

Continue reading “3D Modeling Out Of Thin Air”

All About PS3 SixAxis Controller USB Communications

[Austyn] is currently working on reverse engineering a PlayStation 3 SixAxis controller’s USB communications. You may be thinking that this has already been done but [Austyn] was unable to find useful source code so he’s started his own project called libopenaxis.

The process he used to sniff out USB communications makes for an interesting read. He utilized GlovePIE to get the USB request block for the controller. With that in hand he grabbed the Python script used in a DIY Kinect hacking tutorial to start dumping controller data. With each keypress the script reads out the full data packet, which is used to figure out how the data structures are organized.
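For anyone wanting to try the same trick, a minimal packet dumper can be written along these lines with pyusb. Sony’s vendor ID (0x054c) and the SixAxis product ID (0x0268) are well known; the endpoint address and read size below are assumptions, and the hardware-facing part is only a sketch:

```python
def hexdump(data, width=16):
    """Render one USB request block as rows of hex bytes, so changes between
    packets stand out while mashing controller buttons."""
    rows = []
    for i in range(0, len(data), width):
        chunk = data[i:i + width]
        rows.append(" ".join("%02x" % b for b in chunk))
    return "\n".join(rows)

if __name__ == "__main__":
    import usb.core  # pyusb

    # Sony's vendor ID and the SixAxis product ID.
    dev = usb.core.find(idVendor=0x054C, idProduct=0x0268)
    dev.set_configuration()
    while True:
        input("press Enter to grab a packet")
        # 0x81 is assumed to be the controller's interrupt-IN endpoint.
        packet = dev.read(0x81, 64, timeout=1000)
        print(hexdump(packet))
```

Diffing successive dumps while holding one button at a time is how the meaning of individual bytes gets teased out.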

The project has come as far as knowing all of the data types, but right now the purpose of the majority of those variables is unknown. Hopefully the blanks will be filled in over time. Two things are for sure: if you’re interested in writing Python code that can communicate with PS3 controllers, this is a great source of info, and the Kinect hacking that was so fun to watch over the last few months is still bearing fruit.

Kinect Hacked To Work With Garry’s Mod Means Endless Hours Of Virtual Fun

[John B] is a software engineer and had some spare time on his hands, so he started messing around with his Kinect, which had been sitting unused for a while. He wanted to see what he could create if he was able to get Kinect data into a virtual environment that supported real-world physics. The first idea that popped into his head was to interface the Kinect with Garry’s Mod.

If you are not familiar with Garry’s Mod, it is a sandbox environment built on top of Valve’s Source engine. The environment supports real-world physics, but beyond that, it pretty much lets you do or build anything you want. [John] found that there was no good way to get Kinect data into the software, so he built his own.

He used OpenNI to gather skeletal coordinate data from Kinect, which was then passed to some custom code that packages those coordinates inside UDP packets. Those packets are then sent to a custom Lua script that is interpreted by Garry’s Mod.
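The pipeline is easy to picture in miniature. This Python sketch packages joint coordinates into a UDP datagram the way [John]'s custom code might; the JSON field names and port number are illustrative guesses, not his actual wire format:

```python
import json
import socket

def pack_skeleton(joints):
    """Flatten {joint_name: (x, y, z)} into a JSON payload small enough to
    fit in a single UDP datagram. Field names are illustrative only."""
    payload = {name: {"x": x, "y": y, "z": z}
               for name, (x, y, z) in joints.items()}
    return json.dumps(payload).encode("utf-8")

def send_skeleton(joints, host="127.0.0.1", port=26000):
    """Fire one datagram at the listener inside Garry's Mod (port assumed)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(pack_skeleton(joints), (host, port))
    sock.close()
```

On the receiving end, a Lua script polling that socket would decode each datagram and pose the in-game ragdoll accordingly.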

The result is just plain awesome as you can see in the video below. Instead of simply playing some random game with the Kinect, you get to design the entire experience from the ground up. The project is still in its infancy, but it’s pretty certain that we’ll see some cool stuff in short order. All of the code is available on GitHub, so give it a shot and share your videos with us.

Continue reading “Kinect Hacked To Work With Garry’s Mod Means Endless Hours Of Virtual Fun”

Chilling Drinks With Your Friends’ Faces

3D printing of Kinect-mapped models seems to be all the rage lately. [Nirav] caught the bug and has developed software which allows him to join in the fun. Frustrated by the lack of documentation and source code for the Fabricate Yourself project, he set out to create his own open-source process for scanning people and objects to share with the hacking community.

His software allows you to aim the Kinect and capture a 3D scan of any object, after which you need to use MeshLab or similar software to turn the scan into an STL file for printing. He says the process is a bit tedious at the moment, but he is working hard to condense it down into a single step.
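For the curious, STL is a simple enough target format that writing one by hand is instructive. This Python sketch (not part of [Nirav]'s pipeline, which leans on MeshLab) serializes a list of triangles into ASCII STL:

```python
def triangles_to_stl(triangles, name="kinect_scan"):
    """Serialize a list of triangles (each three (x, y, z) vertices) into an
    ASCII STL string. Normals are left at zero; slicers recompute them."""
    lines = ["solid %s" % name]
    for a, b, c in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (a, b, c):
            lines.append("      vertex %f %f %f" % (x, y, z))
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append("endsolid %s" % name)
    return "\n".join(lines)
```

The hard part of the pipeline isn’t the file format but turning a noisy Kinect point cloud into a clean, watertight triangle mesh in the first place, which is where MeshLab comes in.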

While he can scan and print pretty much anything he wants, his ultimate goal is to create ice cube trays for his friends featuring molds of their faces. The project has a lot of promise, though we’re not sure about our friends crunching on our faces after finishing their drink.