Kinect + Minecraft Trifecta

Today we have a special treat: three projects combining the “fastest-selling consumer electronics device”, Kinect, with the “fastest-selling indie Java game that once kept us from sleeping for an entire weekend”, Minecraft!

[Sean Oczkowski] writes in to tell us about his efforts to play Minecraft with Kinect using no more than the OpenKinect Java wrapper on Ubuntu. The code was written in about four days with some help from Wikipedia. Using histograms to locate the player in the field of view, the script calculates the body’s center of mass and triggers an interaction for whichever limb occupies a given quadrant of the screen. [Sean] does an excellent job of running through the whole process, as well as the decisions made along the way. The whole thing is a bit like running in place, and we can’t imagine the flailing that will occur during the inevitable creeper encounter.
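Here’s a minimal sketch of the quadrant idea as we read it, in plain C++ (not [Sean]’s actual code, which is Java against the OpenKinect wrapper): given the binary player mask from the histogram step, find the center of mass, then flag any quadrant where the silhouette reaches well past the torso. The threshold is a guess you would tune against real frames.

```cpp
#include <cstdint>

struct Actions { bool upperLeft, upperRight, lowerLeft, lowerRight; };

// mask: 1 = player pixel (output of the histogram thresholding step).
Actions classifyPose(const uint8_t* mask, int w, int h) {
    // Center of mass of the player's silhouette.
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (mask[y * w + x]) { sx += x; sy += y; ++n; }
    Actions a = {false, false, false, false};
    if (n == 0) return a;
    const long cx = sx / n, cy = sy / n;

    // How far does the body reach from the center in each quadrant?
    long reach[2][2] = {{0, 0}, {0, 0}};
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            if (mask[y * w + x]) {
                long dx = x - cx, dy = y - cy, d2 = dx * dx + dy * dy;
                long& r = reach[y < cy ? 0 : 1][x < cx ? 0 : 1];
                if (d2 > r) r = d2;
            }

    // An extended limb pushes its quadrant's reach past a threshold
    // (the value here is a guess, not a number from the project).
    const long limit = (long)(h / 3) * (h / 3);
    a.upperLeft  = reach[0][0] > limit;   // e.g. raise an arm to mine
    a.upperRight = reach[0][1] > limit;
    a.lowerLeft  = reach[1][0] > limit;   // e.g. step out to strafe
    a.lowerRight = reach[1][1] > limit;
    return a;
}
```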

Next we have [Wade McGillis] with his award-winning Minecraft Kinect Controller. [Wade] provides source code and executables at his site. This version uses skeletal tracking data to detect the user’s gestures. It still involves holding your hands out like a zombie, but it is a bit more versatile, since you can pass your arms in front of your own body.
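To see why joints help, consider one hand crossing the body’s midline, a case raw silhouette quadrants get wrong. A tiny hedged sketch, with a made-up joint struct standing in for whatever the tracking middleware actually returns:

```cpp
struct Joint { float x, y, z; };   // hypothetical tracker output

// With named joints you can still tell the hands apart when one
// crosses in front of the torso -- a pure quadrant test cannot.
bool rightHandCrossedBody(const Joint& rightHand, const Joint& torso) {
    return rightHand.x < torso.x;  // x increases toward the user's right
}
```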

Finally, [Nathan Viniconis] has been doing some very interesting work using the Kinect to import giant three-dimensional models into the game world. [Nathan] then goes the extra mile and animates the figures! Check out the video below for the really impressive results. We here at Hackaday feel that this is the most appropriate use of this technology, and may begin building gigantic statues of ourselves on public servers.
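We don’t have [Nathan]’s source, but the heart of a depth-to-blocks importer can be sketched in a few lines: sample the depth image on a grid and map each sample to a block coordinate. The scale factors below are made up for illustration.

```cpp
#include <cstdint>
#include <vector>

struct Block { int x, y, z; };

// One block per step-by-step pixel cell; depths are Kinect millimeters.
std::vector<Block> depthToBlocks(const uint16_t* depth, int w, int h, int step) {
    std::vector<Block> blocks;
    for (int v = 0; v < h; v += step)
        for (int u = 0; u < w; u += step) {
            uint16_t mm = depth[v * w + u];
            if (mm == 0 || mm > 3000) continue;  // skip no-data and background
            blocks.push_back({ u / step,         // image column -> world x
                               (h - v) / step,   // image row -> build height
                               mm / 50 });       // ~5 cm of depth per block
        }
    return blocks;   // hand these to whatever block-placement interface you have
}
```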

Check out the tricrafta (minefecta?) of videos after the jump!

Continue reading “Kinect + Minecraft Trifecta”

Real-time Wireframe Video Effect Overlay With Kinect


[Francois] over at 1024 Architecture has been working on a project we think you’re likely to see in a professional music video before too long. Using his Kinect sensor, he has been tracking skeletal movements and adding special effects to the resulting wireframe with Quartz Composer. While this idea isn’t new, the next part is. He takes the QC-tweaked video stream and projects it back over the performer using MadMapper, matching the video to the body movements and recording the resultant display.

The project started out with a few hiccups, including a noticeable delay between the body tracking and the display. It forced the performer to move more slowly than he would have liked, so things had to be tweaked. [Francois] first tested the latency between his computer and the projector by displaying a timecode on the screen as well as via the projector. He found the projector added a latency of just one frame at 60 fps, which wasn’t too bad. This led him to believe the culprit was his Kinect, and he was right: it added a six-frame delay, so he locked the video output to 30 fps in hopes of cutting that delay in half.
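The arithmetic is worth spelling out, so here’s a quick sanity-check program (ours, not [Francois]’s) putting those frame counts into milliseconds:

```cpp
#include <cstdio>

// Convert a delay measured in frames into wall-clock milliseconds.
double framesToMs(int frames, double fps) { return frames * 1000.0 / fps; }

int main() {
    std::printf("projector: %5.1f ms (1 frame  @ 60 fps)\n", framesToMs(1, 60));
    std::printf("kinect:    %5.1f ms (6 frames @ 60 fps)\n", framesToMs(6, 60));
    // Locking the output to 30 fps doesn't shrink the 100 ms of lag,
    // but the performer now sees 3 stale frames instead of 6.
    std::printf("kinect:    %5.1f ms (3 frames @ 30 fps)\n", framesToMs(3, 30));
    return 0;
}
```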

The effect is slightly reminiscent of Tron, but with more distortion. We can’t wait to see more projects similar to this one in the future.

The resulting video embedded below is pretty cool in our opinion, but you can judge for yourself.

Continue reading “Real-time Wireframe Video Effect Overlay With Kinect”

Clever Hack Tethers A Kinect Sensor To The PS3


Now that Kinect has been hacked to work with just about everything from robots to toaster ovens, someone finally got around to tweaking it for use on the PS3.

[Shantanu] has been hard at work writing code and experimenting with some preexisting Kinect software to get the sensor to talk to his PS3. The Kinect is hooked up to a PC, which captures all of his movements with OpenNI. Those movements are mapped to PS3 controls via NITE, a piece of middleware that interprets gestures as commands. All of the captured button presses are then relayed to the PS3 over a Bluetooth connection using DIYPS3Controller.
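The PC is really just a translator in the middle. Here’s a minimal sketch of that glue layer, with placeholder gesture names and button codes; the real OpenNI/NITE callbacks and the DIYPS3Controller protocol are of course more involved.

```cpp
#include <cstdint>
#include <map>
#include <string>

enum class Ps3Button : uint8_t { Cross = 0, Circle, Square, Triangle };

class GestureToPad {
    std::map<std::string, Ps3Button> bindings_;
public:
    // Map a middleware gesture name to a pad button.
    void bind(const std::string& gesture, Ps3Button b) { bindings_[gesture] = b; }

    // Called from the gesture-recognized callback; true if a press was sent.
    bool onGesture(const std::string& gesture) {
        auto it = bindings_.find(gesture);
        if (it == bindings_.end()) return false;
        sendButtonPress(it->second);
        return true;
    }
private:
    void sendButtonPress(Ps3Button b) {
        // Hand the press to the Bluetooth pad emulator here, e.g. over a
        // local socket to the DIYPS3Controller process.
        (void)b;
    }
};
```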

As you can see in the video below, the solution works pretty well for what should be considered pre-alpha code. He has been able to map several custom gestures to button presses, and the Kinect does an overall decent job tracking his limbs and translating their movements to on-screen actions. The actual in-game use is a bit rough at the moment, but aside from the infancy of the code, you have to remember that these games were never meant to be played with the Kinect.

It’s a job well done, and we can’t wait to see where this project goes.

Looking for more Kinect fun? Look no further than right here.

[via Kinect-Hacks]

Continue reading “Clever Hack Tethers A Kinect Sensor To The PS3”

People-tracking Orb Demo Makes Us Want To Build Our Own


Earlier this week, we came across a video of an orb-based eyeball that would follow you throughout the room, based on data gathered from a Kinect sensor. Try as we might, we couldn’t find much more than the video, but it seems that the guys behind the project have spoken up in a recent blog post.

[Jon George] of The Design Studio UK explained that the person-tracking eyeball visualization was built using a PC, a Kinect, and a product called the Puffersphere, which projects a 360-degree image on the inside of a glass orb. To suit the special lens inside the sphere, a filter warps a panoramic image into a circular shape.
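That filter is the interesting bit. Here’s a hedged sketch of such a warp for a single-channel panorama: each output pixel is converted to polar coordinates, the angle picks a panorama column, and the radius picks a row. The real Puffersphere mapping surely corrects for its lens, which this ignores.

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

const double PI = 3.14159265358979323846;

// pano: pw x ph grayscale panorama; returns an out x out circular image.
std::vector<uint8_t> warpToCircle(const std::vector<uint8_t>& pano,
                                  int pw, int ph, int out) {
    std::vector<uint8_t> dst(out * out, 0);
    const double c = out / 2.0, maxR = out / 2.0;
    for (int y = 0; y < out; ++y)
        for (int x = 0; x < out; ++x) {
            double dx = x - c, dy = y - c;
            double r = std::sqrt(dx * dx + dy * dy);
            if (r >= maxR) continue;                    // outside the disc
            double theta = std::atan2(dy, dx) + PI;     // 0 .. 2*pi
            int u = int(theta / (2 * PI) * (pw - 1));   // angle  -> column
            int v = int(r / maxR * (ph - 1));           // radius -> row
            dst[y * out + x] = pano[v * pw + u];
        }
    return dst;
}
```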

After the image has been created, a simple Windows app working with the OpenNI framework makes the image follow you around the room.

The only problem with this fun little project is the price of the sphere – we’re not sure what it is exactly, but rest assured it is more than we are willing to pay for such a toy. We’re thinking there has to be a way to simulate the orb’s effect to some degree using cheaper hardware. It might be possible with a small-scale DIY version of this spherical mirror projection build, though that build uses concave half-spheres rather than full orbs.

In the meantime, take a look at these two videos of the orb in action. Don’t worry – we know you were totally thinking about the Eye of Sauron, so the second video should not disappoint.

Continue reading “People-tracking Orb Demo Makes Us Want To Build Our Own”

Giving “sight” To The Visually Impaired With Kinect


We have seen Kinect used in a variety of clever ways over the last few months, but some students at the [University of Konstanz] have taken Kinect hacking to a whole new level of usefulness. Rather than use it to control lightning or to kick around some boxes using Garry’s Mod, they are using it to develop Navigational Aids for the Visually Impaired, or NAVI for short.

A helmet-mounted Kinect sensor is connected to a laptop stored in the user’s backpack. Custom software uses the Kinect’s depth information to generate a virtual map of the environment. The computer sends signals to an Arduino board, which relays them to one of three LilyPad Arduinos mounted on a waist belt. The LilyPads each control a motor that vibrates to alert the user to obstacles. The group even added voice notifications triggered by specialized markers, alerting the user to doors and other specific items of note.
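The post doesn’t publish the protocol between the boards, so here’s our guess at the receiving end, compressed onto a single board for brevity (NAVI actually fans the signals out to three LilyPads): the laptop streams one intensity byte per motor, and closer obstacles buzz harder.

```cpp
const int motorPins[3] = {3, 5, 6};   // PWM-capable pins

void setup() {
    Serial.begin(9600);
    for (int i = 0; i < 3; ++i) pinMode(motorPins[i], OUTPUT);
}

void loop() {
    // One frame = three bytes: left, center, right vibration strength.
    // (Real code would add framing so the motors can't fall out of sync.)
    if (Serial.available() >= 3)
        for (int i = 0; i < 3; ++i)
            analogWrite(motorPins[i], Serial.read());  // 0 = off, 255 = strong
}
```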

It really is a great use of the Kinect sensor, and we can’t wait to see more projects like this in the future.

Stick around to see a quick video of NAVI in use.

[via Kinect-Hacks – thanks, Jared]

Continue reading “Giving “sight” To The Visually Impaired With Kinect”

The Evil Genius Simulator: Kinect Controlled Tesla Coils

The London Hackspace crew was having a tough time getting their Kinect demos running at Makefair 2011. While at the pub, they had the idea of combining forces with Brightarcs Tesla coils and produced The Evil Genius Simulator!

After getting the go-ahead from Brightarcs and the input specs of the coils, they came up with an application in openFrameworks that uses skeletal tracking data to determine hand position. The hand position is scaled between two manually set calibration bars (seen in the video below). The scaled position then speeds up or slows down a 50 Hz WAV file to produce the 50-200 Hz sine wave required by each coil. It only took an hour, but the results are brilliant; video after the jump.
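The mapping itself is simple enough to sketch; this is our guess at the math, not the Hackspace code:

```cpp
// A hand position between the calibration bars becomes a target
// frequency for the coil.
double handToFrequency(double handY, double barLow, double barHigh) {
    double t = (handY - barLow) / (barHigh - barLow);  // normalize to 0..1
    if (t < 0) t = 0;
    if (t > 1) t = 1;
    return 50.0 + t * 150.0;   // 50 Hz at one bar, 200 Hz at the other
}

// Playing the 50 Hz WAV back at (frequency / 50) x normal speed retunes
// the sine to the requested pitch.
double playbackRate(double frequency) { return frequency / 50.0; }
```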

We’ve previously featured plenty of stories on the Kinect, and we’ve seen Tesla coils that respond to music, coils that make music, and even MIDI-controlled coils; it’s nice to see it all combined.

Thanks to [Matt Lloyd] for the tip!

Continue reading “The Evil Genius Simulator: Kinect Controlled Tesla Coils”

3D Modeling Out Of Thin Air


It seems that with each passing day, the Kinect hacks that we see become exponentially more impressive. Take for instance this little number that was sent to us today.

[sonsofsol] has combined several open source software packages and a little electronics know-how to create one of the more useful Kinect hacks we have seen lately. His project enables him to manipulate 3D models in GEM simply by moving his hands about in front of his Kinect sensor. Using OpenNI on Ubuntu, the computer tracks all of his movements and translates them into actions within the GEM 3D engine.

To make things easier on himself, he also constructed a pair of electronic gloves to interface with the system. Using an Arduino, the gloves send complex commands to the 3D modeling software whenever he touches different pairs of fingers together.
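The write-up doesn’t detail the wiring, but the classic way to build such a glove is a thumb pad tied to ground and a pulled-up input pin per fingertip, something like this Arduino sketch:

```cpp
const int fingerPins[4] = {2, 3, 4, 5};   // index, middle, ring, pinky pads

void setup() {
    Serial.begin(9600);
    for (int i = 0; i < 4; ++i) pinMode(fingerPins[i], INPUT_PULLUP);
}

void loop() {
    byte command = 0;
    for (int i = 0; i < 4; ++i)
        if (digitalRead(fingerPins[i]) == LOW)   // thumb-to-finger contact closed
            command |= (1 << i);
    Serial.write(command);   // one bit per finger touching the thumb
    delay(20);               // crude debounce / rate limit
}
```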

You really need to take a look at the video embedded below to get a feel for how complex [sonsofsol]’s “simple” mesh modeler really is.

Looking for more Kinect fun? Check out these previously featured stories.

[Thanks, Jared]

Continue reading “3D Modeling Out Of Thin Air”