Kinect Hacked To Work With Garry’s Mod Means Endless Hours Of Virtual Fun

[John B] is a software engineer with some spare time on his hands, so he started messing around with his Kinect, which had been sitting unused for a while. He wanted to see what he could create if he could get Kinect data into a virtual environment that supported real-world physics. The first idea that popped into his head was to interface the Kinect with Garry’s Mod.

If you are not familiar with Garry’s Mod, it is a sandbox environment built on top of Valve’s Source engine. The environment supports real-world physics, but beyond that it pretty much lets you do or build anything you want. [John] found that there was no good way to get Kinect data into the software, so he wrote his own interface.

He uses OpenNI to gather skeletal coordinate data from the Kinect, which is passed to custom code that packages those coordinates into UDP packets. The packets are then sent to a custom Lua script that Garry’s Mod interprets.
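The post doesn’t include a code excerpt, but the sending side of that pipeline is easy to sketch. Below is a minimal, hypothetical Python version: joint positions (faked here, but in practice pulled from OpenNI) are serialized into a small text packet and fired at the port the Garry’s Mod Lua script is assumed to listen on. The packet format and port number are made up for illustration; the real ones are defined in [John B]’s code.

```python
# A minimal, hypothetical sketch of the sending side: serialize joint positions
# and push them over UDP to the Lua script. The packet format and port number
# are invented for illustration; the real ones live in [John B]'s repository.
import socket
import time

GMOD_HOST, GMOD_PORT = "127.0.0.1", 26000   # placeholder address for the Lua listener

def send_skeleton(sock, joints):
    """joints: dict mapping joint name -> (x, y, z), as reported by the tracker."""
    fields = [f"{name}:{x:.3f},{y:.3f},{z:.3f}" for name, (x, y, z) in joints.items()]
    sock.sendto(";".join(fields).encode("ascii"), (GMOD_HOST, GMOD_PORT))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Dummy data standing in for live OpenNI skeletal output
    fake_joints = {
        "head": (0.0, 1.6, 2.0),
        "left_hand": (-0.4, 1.1, 1.8),
        "right_hand": (0.4, 1.1, 1.8),
    }
    while True:
        send_skeleton(sock, fake_joints)
        time.sleep(1 / 30)                  # roughly the Kinect's 30 fps frame rate
```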

The result is just plain awesome, as you can see in the video below. Instead of simply playing some random game with the Kinect, you get to design the entire experience from the ground up. The project is still in its infancy, but it’s pretty certain that we’ll see some cool stuff in short order. All of the code is available on GitHub, so give it a shot and share your videos with us.


Chilling Drinks With Your Friends’ Faces

3D printing of Kinect-mapped models seems to be all the rage lately. [Nirav] caught the bug and has developed software that lets him join in the fun. Frustrated by the lack of documentation and source code for the Fabricate Yourself project, he set out to create his own open-source process for scanning people and objects to share with the hacking community.

His software lets you aim the Kinect and capture a 3D scan of any object, after which you need to use MeshLab or similar software to turn the scan into an STL file for printing. He says the process is a bit tedious at the moment, but he is working hard to condense it down into a single step.
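The write-up doesn’t spell out the geometry, but the step that turns a raw depth frame into something MeshLab can open is just a pinhole back-projection. Here is a rough Python sketch of that step, assuming a 640x480 depth frame in millimetres and the Kinect’s approximate intrinsics; [Nirav]’s tool handles the actual capture, so treat the numbers and file format below as illustrative only.

```python
# A rough sketch of the depth-to-point-cloud step, assuming a 640x480 depth
# frame in millimetres and approximate Kinect intrinsics. It writes an ASCII
# PLY file that MeshLab can open; the capture itself is handled by [Nirav]'s tool.
import numpy as np

FX = FY = 575.8          # approximate Kinect depth-camera focal length, in pixels
CX, CY = 320.0, 240.0    # principal point for a 640x480 frame

def depth_to_ply(depth_mm, path="scan.ply"):
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0        # millimetres to metres
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    pts = pts[pts[:, 2] > 0]                        # drop pixels with no depth reading
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(pts)}\n")
        f.write("property float x\nproperty float y\nproperty float z\nend_header\n")
        np.savetxt(f, pts, fmt="%.4f")

# depth_to_ply(my_depth_frame)   # then clean up the cloud in MeshLab and export an STL
```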

While he can scan and print pretty much anything he wants, his ultimate goal is to create ice cube trays for his friends featuring molds of their faces. The project has a lot of promise, though we’re not sure about our friends crunching on our faces after finishing their drink.

Encase Yourself In Carbonite With Kinect

There never seems to be a lull in the stream of new and novel hacks that people create around Microsoft’s Kinect. One of the more recent uses for the device comes from [Interactive Fabrication] and allows you to fabricate yourself, in a manner of speaking.

The process uses the Kinect to create a 3D model of a person, which is then displayed on a computer monitor. Once you have selected your preferred pose, the model is printed on a 3D plastic printer. Each scan results in a 3cm x 3cm plastic tile complete with snap-together dovetail joints that allow multiple prints to be combined. A full-body scan can be constructed from three of these tiles, resulting in a neat “Han Solo trapped in Carbonite” effect.

Currently only about a third of the Kinect’s full resolution is being used to create these models, which is pretty promising news for anyone who wants to try this at home. In theory, you should be able to create larger, more detailed models of yourself, provided you have a 3D printer at your disposal.

Keep reading for a quick video presentation of the fabrication process.


Kinect Home Theater Control

[Harishankar] has posted a video on his blog demonstrating how he controls his home theater gear over IR using the Microsoft Kinect sensor. While controlling devices with the Kinect is nothing new, he is doing something a little different from what you have seen before. The Kinect interfaces directly with his Mac Mini and tracks his movements via OpenNI. Those movements are then compared against a list of predefined gestures, each mapped to a specific IR function for controlling his home theater.

Once a gesture has been recognized, the corresponding command is relayed from the Mac to the various home theater components via a USB-UIRT. There are not a lot of details fleshed out in the blog post, but [Harishankar] says he will gladly forward his code to you if you request it via email.
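Since the code is only available on request, here is a loose Python sketch of the general approach: watch a tracked hand coordinate over a short window of frames, match the motion against a small table of gestures, and fire the mapped command. The gesture names, thresholds, and the send_ir() stub are all assumptions; [Harishankar]’s actual code drives a USB-UIRT from the Mac.

```python
# A loose sketch of the approach: track the right hand's x coordinate over a
# short window, detect a horizontal swipe, and fire the mapped command. The
# gesture table, thresholds, and send_ir() stub are assumptions; the real code
# drives a USB-UIRT attached to the Mac.
from collections import deque

GESTURE_TO_IR = {"swipe_right": "VOLUME_UP", "swipe_left": "VOLUME_DOWN"}

def send_ir(command):
    print("Would transmit IR code for:", command)   # stand-in for the USB-UIRT call

class SwipeDetector:
    def __init__(self, window=15, threshold=0.30):  # about 0.5 s of frames, 30 cm of travel
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, hand_x):
        self.history.append(hand_x)
        if len(self.history) == self.history.maxlen:
            travel = self.history[-1] - self.history[0]
            if abs(travel) > self.threshold:
                self.history.clear()
                return "swipe_right" if travel > 0 else "swipe_left"
        return None

detector = SwipeDetector()
for x in [i * 0.05 for i in range(20)]:             # a fake hand sweeping to the right
    gesture = detector.update(x)
    if gesture:
        send_ir(GESTURE_TO_IR[gesture])
```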

Thanks to [Peter] for the tip.

ROS Gains Full Body Telemetry

[Taylor Veldrop] has been playing with an NAO robot and ROS, mixed with a Kinect, to get some pretty amazing results. The last time we saw any work done with ROS and the Kinect, it allowed some basic telemetry using the PR2. [Taylor] has taken this a step further, allowing full-body control of the NAO robot. Basic mimicking mixed with a little bit of autonomy lets the NAO follow his steps around a room and even slice a banana or hammer nails. We think this is pretty impressive, and it would be even more so mixed with a motion-tracking stereoscopic display. Follow along after the break to see it pull off some of these cool feats.
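For a sense of how the mimicking half might look, here is a toy Python sketch that maps a Kinect shoulder-to-elbow vector onto a single NAO joint using the Aldebaran naoqi SDK. [Taylor]’s setup routes everything through ROS and covers the whole body; the robot address, angle math, and speed fraction below are placeholders.

```python
# A toy sketch of the mimicking idea using the Aldebaran naoqi Python SDK:
# turn a Kinect shoulder-to-elbow vector into a single shoulder-pitch angle and
# push it to the robot. [Taylor]'s system goes through ROS and covers the whole
# body; the robot address, angle math, and speed fraction here are placeholders.
import math
from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)     # placeholder robot address
motion.setStiffnesses("RArm", 1.0)                  # the arm needs stiffness to move

def mimic_right_shoulder(shoulder, elbow):
    """shoulder, elbow: (x, y, z) joint positions from the Kinect tracker, in metres."""
    dy = elbow[1] - shoulder[1]                     # vertical component of the upper arm
    dz = elbow[2] - shoulder[2]                     # component toward the sensor
    pitch = math.atan2(-dy, dz)                     # arm down gives positive pitch, raised gives negative
    pitch = max(-2.0, min(2.0, pitch))              # stay inside the joint's rough limits
    motion.setAngles(["RShoulderPitch"], [pitch], 0.2)   # 0.2 = fraction of max speed

# mimic_right_shoulder((0.2, 1.4, 2.0), (0.2, 1.1, 2.1))  # an arm hanging mostly downward
```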


VR! Now With More Kinect, Wiimote, And Vuzix

Those of us who remember when you could actually go to a mall and play on a VR game machine tend to remember it fondly. What happened? Computing horsepower has grown enormously and our graphics nowadays are simply stunning, yet there’s been no major VR revival. Yeah, those helmets were huge and gave you a headache, but it was worth it. With the 3D positioning abilities of the latest game crazes, the Wiimote and the Kinect, [Nao_u] is finally taking this where we all knew it should have gone (Google translated). Well, maybe we would have had fewer creepy anime faces flying around squirting ink, but the basics are there. He has created a VR system utilizing the Wiimote for hand position, a Vuzix display for head positioning, and the Kinect for body tracking. Even with the creepy flying heads I want to play it, especially after seeing him physically duck behind boxes in the video after the break. Long live VR!


Projector Tricks Make Use Of Kinect 3D Mapping

[Don’t stop the clock] is doing some work with a projector, a camera, and the Kinect. What he’s accomplished is quite impressive, combining the three to manipulate light with your body. The image above is a safer rendition of the Hadouken from the Street Fighter video games, throwing light across the room instead of fire. This comes at the end of the video after the break, but first he shows off the core features of the system. You can hold up your hand and wave it to turn it into a light source. In other words, the projector shines light on your hand, following it and adjusting the intensity based on where your hand sits in 3D space. Since the Kinect sends fairly precise data back to the computer, the projected image is trimmed to match your hand and arm without overflowing onto the rest of the room, at least until you touch a surface you want illuminated or throw the light source with a flick of the wrist. It may seem trivial at first glance, but we find the alignment of the projector and the speed at which the image updates quite impressive.
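There’s no source posted, but the masking trick itself can be sketched in a few lines of Python: keep only the depth pixels close to the sensor (the outstretched hand and arm), map that mask into projector coordinates, and light up just those pixels. The homography and depth thresholds below are placeholders, and the real project’s projector-camera calibration is certainly more involved than this.

```python
# A bare-bones sketch of the masking idea: keep only depth pixels near the
# sensor (the outstretched hand and arm), warp that mask into projector
# coordinates, and light up just those pixels. The homography and thresholds
# are placeholders; the real projector-camera calibration is more involved.
import numpy as np
import cv2

PROJ_W, PROJ_H = 1024, 768
H = np.eye(3, dtype=np.float32)      # placeholder depth-camera to projector homography

def light_mask(depth_mm, near=500, far=900):
    """Return a projector-sized frame that is white wherever the hand is."""
    hand = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255
    hand = cv2.medianBlur(hand, 5)   # knock the speckle out of the depth image
    return cv2.warpPerspective(hand, H, (PROJ_W, PROJ_H))

# Fake a frame: everything two metres away except a pretend hand at 0.7 m
fake_depth = np.full((480, 640), 2000, dtype=np.uint16)
fake_depth[200:280, 300:360] = 700
cv2.imwrite("projector_frame.png", light_mask(fake_depth))
# In the real system this runs every frame and the result goes out the projector.
```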
