3D Render Live with Kinect and Bubble Boy

[Mike Newell] dropped us a line about his latest project, Bubble Boy, which uses the Kinect's point cloud functionality to render polygonal meshes in real time. In the video, [Mike] goes through the entire process, from installing the libraries to grabbing code off of his site. Currently the rendering looks like a clump of dough (nightmarishly clawing at us with its nubby arms).

[Mike] is looking for suggestions on more efficient mesh and point cloud code, as he is unable to run at any higher resolution than what you see in the video. You can hear his computer fan spool up after just a few moments of rendering! Anyone good with point clouds?
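We haven't dug into [Mike]'s code, but as a rough illustration of why resolution hurts so badly: in a grid mesh built from a depth image, the triangle count grows with the square of the sampling density. Here is a minimal numpy sketch, with a faked depth frame standing in for real Kinect data and a decimation step of our own invention, showing how a small skip factor slashes the vertex and face counts.

```python
import numpy as np

def depth_to_mesh(depth, step=4):
    """Build a grid mesh from a depth image, keeping every step-th pixel.
    Triangle count scales with (rows/step) * (cols/step), so step=4 yields
    roughly 1/16th the geometry of a full-resolution mesh."""
    ys, xs = np.mgrid[0:depth.shape[0]:step, 0:depth.shape[1]:step]
    zs = depth[ys, xs].astype(np.float32)
    verts = np.dstack([xs, ys, zs]).reshape(-1, 3)

    # Vertex index grid for the decimated samples
    h, w = ys.shape
    idx = np.arange(h * w).reshape(h, w)

    # Two triangles per grid cell
    tl, tr = idx[:-1, :-1].ravel(), idx[:-1, 1:].ravel()
    bl, br = idx[1:, :-1].ravel(), idx[1:, 1:].ravel()
    faces = np.concatenate([np.stack([tl, tr, bl], axis=1),
                            np.stack([tr, br, bl], axis=1)])
    return verts, faces

# Fake 640x480 depth frame standing in for a real Kinect capture
depth = np.random.randint(500, 2000, (480, 640)).astype(np.uint16)
verts, faces = depth_to_mesh(depth, step=4)
print(verts.shape, faces.shape)  # ~19k verts / ~38k tris vs. ~307k / ~612k at full res
```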

Also, check out his video after the jump.

Continue reading “3D Render Live with Kinect and Bubble Boy”

Morphing robot demonstrated at IROS

A morphing robot was demonstrated at the IROS conference this week. This orb has no rigid structure but uses some type of “inflation” system for locomotion. This robot concept is offered up by the iRobot company as part of a DARPA initiative they’re working on. The “inflation” is really a substance in the skin that can be converted from a liquid-like state to a solid-like one. They call this “The Jamming Concept” and give a layman’s explanation in the video we’ve embedded after the break.

When moving, this white ball is a churning, turning, bulging mass of terror. The just-about-to-hatch pods from Alien, or perhaps something from Doom 3, come to mind. The hexapod from IROS that we covered yesterday was amazing, but this really creeps us out. What's more, this is footage from the iRobot prototypes of a year ago. The newer stuff can do much more, like having several of these things glob together into one unit.

We’re glad that [DarwinSurvior] sent us the tip on this one, but now we’re not going to be able to sleep at night.

Continue reading “Morphing robot demonstrated at IROS”

Controlling Spykee via web cam using your fingers

[epokh] sent in this cool project where he wrote some custom code to control the Spykee robot using gestures. He wraps his fingers in green tape, filters everything but green out of his web cam feed, then runs a series of filters to clean up the result. The remaining “blobs” are tracked and converted to motor commands. You can see the setup in action in the video after the break. This guy might look familiar, as we recently posted a super quick head tracking rig he built with Legos. Some of you mentioned in the comments that the Legos were a waste; you'll find he thought so too, and ended up fabbing a simple rig to take their place.
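[epokh]'s code is his own and we haven't seen the source, but the pipeline he describes (mask out everything that isn't green, clean it up, then track the leftover blobs) maps pretty directly onto stock OpenCV calls. Below is a hedged Python sketch of that idea; the HSV thresholds, the minimum blob area, and the spot where motor commands would be issued are our own placeholders, not anything pulled from his Spykee controller.

```python
import cv2
import numpy as np

# Rough HSV bounds for "green tape" -- a guess that would need tuning
# for the actual tape color and lighting.
LOWER_GREEN = np.array([40, 80, 80])
UPPER_GREEN = np.array([80, 255, 255])

cap = cv2.VideoCapture(0)  # default web cam
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Keep only green pixels, then clean up speckle with morphology
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Each remaining blob is (presumably) a taped fingertip
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    for c in contours:
        if cv2.contourArea(c) < 200:  # ignore small noise blobs
            continue
        (x, y), r = cv2.minEnclosingCircle(c)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 0, 255), 2)
        # A real controller would map each blob's (x, y) to robot motor
        # commands here; this sketch just draws the tracked blobs.

    cv2.imshow("blobs", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```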

Continue reading “Controlling Spykee via web cam using your fingers”