3D Modeling Out Of Thin Air


It seems that with each passing day, the Kinect hacks that we see become exponentially more impressive. Take for instance this little number that was sent to us today.

[sonsofsol] has combined several open source software packages and a little electronics know-how to create one of the more useful Kinect hacks we have seen lately. His project enables him to manipulate 3D models in GEM simply by moving his hands about in front of his Kinect sensor. Running OpenNI on Ubuntu, the computer tracks all of his actions and translates them into actions within the GEM 3D engine.

To make things easier on himself, he also constructed a pair of electronic gloves that interface with the system. Using an Arduino, the gloves send different commands to the 3D modeling software simply by touching different pairs of fingers together.

You really need to take a look at the video embedded below to get a feel for how complex [sonsofsol’s] “simple” mesh modeler really is.

Looking for more Kinect fun? Check out these previously featured stories.

[Thanks, Jared]

14 thoughts on “3D Modeling Out Of Thin Air”

  1. I’ve been saying to friends this is what I want to do with a kinect… so I’m grinning here.

    I can’t wait to see this taken further and to have the ability to “wear” the models….(think that scene in Iron Man when Stark “puts on” the arm design to test it out…)

  2. I did this years back with a WiiMote, Maya, IR LEDs, and a pair of neoprene wetsuit gloves.

    Because the WiiMote tracks up to four points, I used one constant-on IR LED on each glove to track position and another that would turn on if you pinched fingers together.

    It wasn’t quite as complex, but it was wireless and worked just fine.

  3. Amazing project. Now someone needs to take the next step and mix the Kinect with Microsoft Surface and set up a Minority Report style interface.

    I am very impressed. Great job.

  4. haha! kudos for the Lego Technic + CD-ROM casing house for the Arduino. The Arduino must really feel at home there…

    I love 3D modeling and Arduino. It’s cool to see them combined. However, the Kinect remains a toy. I notice the application is quite poor at sensing the depth of the limbs. It’s very coarse, although better than nothing.

  5. I see good things in the future. The Kinect is a first generation device, and all of these projects are made by tinkerers in their spare time. With newer generations of the Kinect (or its competitors/derivatives) and more bodies behind the projects, the potential is immense.

  6. As a 3D modeler/artist myself, I’m just dying to try this sort of tech out.
    I have also been wondering if eye tracking could be done accurately enough to pinpoint the 3D point the user is looking at – that might also be a good way to model, or at least to have a 3D cursor.

    “Now we need some good holographic projection”

    Or, more realistically, some Augmented Reality specs :)
