Portabilizing The Kinect

Way back when the Kinect was first released, there was a realization that this device would be the future of everything 3D. It was augmented reality, it was a new computer interface, it was a cool sensor for robotics applications, and it was a 3D scanner. When the first open source driver for the Kinect was released, we were assured that this was how we would get 3D data from real objects into a computer.

Since then, not much has happened. We’re not using the Kinect for a UI, gamers were horrified they would be forced to buy the Kinect 2 with the new Xbox, and you’d be hard pressed to find a Kinect in a robot. 3D scanning is the only field where the Kinect hasn’t been overhyped, and even there it’s still a relatively complex setup.

This doesn’t mean a Kinect 3D scanner isn’t an object of desire for some people, or that it’s impossible to build a portabilized version. [Mario]’s girlfriend works as an archaeologist, and having a tool to scan objects and places in 3D would be great for her. Because of this, [Mario] is building a handheld 3D scanner with a Raspberry Pi 2 and a Kinect.

This isn’t the first time we’ve seen a portabilized Kinect. Way back in 2012, the Kinect was made handheld with the help of a Gumstix board. Since then, a million tiny ARM single board computers have popped up, and battery packs are readily available. It was only a matter of time until someone stepped up to the plate, and [Mario] was the guy.

The problem facing [Mario] isn’t hardware. Anyone can pick up a Kinect at GameStop, the Raspberry Pi 2 should be more than capable of reading the depth sensor on the Kinect, and the two can be tied together with 3D printed parts. The real problem is the software. [Mario] already has libfreenect compiling without a problem on the Pi 2; the project still requires a lot of additional libraries, including some OpenCV stuff, but so far everything is working.
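
If you’re wondering what “working” looks like at this stage, here’s a minimal sketch, assuming the Python wrapper that ships with libfreenect plus NumPy and OpenCV are installed on the Pi: it grabs a depth frame from the Kinect and throws it on screen, which is roughly the first milestone for any scanner build like this. It’s an illustration, not [Mario]’s actual code.

```python
# Minimal libfreenect test: grab a depth frame from the Kinect and display it.
# Assumes the Python wrapper bundled with libfreenect, NumPy, and OpenCV are
# installed -- a sketch of the first milestone, not [Mario]'s actual code.
import freenect
import numpy as np
import cv2

def get_depth_8bit():
    depth, _ = freenect.sync_get_depth()   # raw 11-bit depth image + timestamp
    return (depth >> 3).astype(np.uint8)   # squash 0-2047 into 0-255 for display

while True:
    cv2.imshow("Kinect depth", get_depth_8bit())
    if cv2.waitKey(10) == 27:              # Esc quits
        break
cv2.destroyAllWindows()
```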

You can check out his video of the proof of concept below.

Continue reading “Portabilizing The Kinect”

Interactive Fur Mirror Follows Your Every Move

We think artist [Daniel Rozin] spent a bit too much time wondering if he could make an interactive fur mirror, without wondering if he should. The result is… strange — to say the least.

It’s called the PomPom Mirror, and it’s one of many interactive installations in the Descent With Modification exhibit at Bitforms — there’s even a super cute flock of penguins which spin around to create the same effect.

The mirror is 4 by 4 feet and 18″ deep. It has 928 faux fur pom poms which are controlled by 464 motors, each effectively with an “on” and “off” state. A Microsoft Kinect tracks movement and creates a black and white binary image of what it sees. The artist also programmed in a few animation sequences which make the mirror come alive — like some weird furry alien / plant thing…
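
As a rough sketch of that signal chain, assuming the libfreenect Python bindings and OpenCV, and with the motor grid layout and depth threshold guessed rather than taken from the installation: threshold a depth frame into a silhouette, then downsample it to one on/off cell per motor.

```python
# Toy version of the PomPom Mirror pipeline: depth frame -> binary silhouette
# -> one on/off state per motor. The 29x16 grid and the depth threshold are
# guesses for illustration, not details from the actual installation.
import freenect
import numpy as np
import cv2

GRID_ROWS, GRID_COLS = 29, 16      # 464 cells, one per motor (assumed layout)
NEAR_RAW = 700                     # raw 11-bit depth units; "someone is close"

def motor_states():
    depth, _ = freenect.sync_get_depth()              # 480x640 raw depth frame
    silhouette = (depth < NEAR_RAW).astype(np.uint8)  # 1 where a viewer is near
    grid = cv2.resize(silhouette, (GRID_COLS, GRID_ROWS),
                      interpolation=cv2.INTER_AREA)   # average down to the motor grid
    return grid > 0                                   # True = push pom pom forward
```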

Continue reading “Interactive Fur Mirror Follows Your Every Move”

Printing Photorealistic Images on 3D Objects

Hydrographic printing is a technique for transferring colored inks on a film to the surface of an object. The film is floated on water and activated with a chemical that allows it to adhere to an object physically pushed down onto it. Researchers at Zhejiang University and Columbia University have taken hydrographic printing to the next level (pdf link). In a technical paper to be presented at ACM SIGGRAPH 2015 in August, they explain how they developed a computational method to create complex patterns that are precisely aligned to the object.

Typically, repetitive patterns are used because the object stretches the adhesive film; anything complex would distort during this imprecise process. It’s commonly used to decorate car parts, especially rims and grilles. If you’ve ever seen a carbon-fiber pattern without the actual fiber, it’s probably been applied with hydrographic printing.

The physical setup for this hack is fairly simple: a vat of water, a linear motor attached to a gripper, and a Kinect. The object is attached to the gripper. The Kinect measures its location and orientation. This data is applied to a 3D scan of the object along with the desired texture map to be printed onto it. A program creates a virtual simulation of the printing process, outputting a specific pattern onto the film that accounts for the warping inherent to the process. The pattern is then printed onto the film using an ordinary inkjet printer.
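
The researchers’ simulation is the hard part and we won’t pretend to reproduce it, but the final step — pre-warping the texture so the stretch of the film cancels it out — is easy to illustrate. In the toy sketch below, the distortion map is a made-up placeholder standing in for the simulation’s output.

```python
# Toy illustration of pre-distorting a texture before printing it on film.
# map_x/map_y stand in for the researchers' simulated stretch field: for each
# pixel of the film, they say which texture pixel ends up there once the film
# wraps the object. The sinusoidal field below is a placeholder, not real data.
import numpy as np
import cv2

texture = cv2.imread("texture.png")            # the appearance we want on the object
h, w = texture.shape[:2]

ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
map_x = xs + 10.0 * np.sin(ys / 40.0)          # fake, smooth distortion field
map_y = ys

# Sample the texture through the distortion; after the film stretches over the
# object, the pattern should land back where the texture intended it.
film_image = cv2.remap(texture, map_x, map_y, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("film_to_print.png", film_image)
```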

The tiger mask is our personal favorite, along with the leopard cat. They illustrate just how complex the surface patterns can get using single or multiple immersions, respectively. This system also accounts for objects of a variety of shapes and sizes, though the researchers admit there is a physical limit to how concave the parts of an object can be. Colors will fade or the film will split if stretched too thin. Texture mapping can now be physically realized in a simple yet effective way, with amazing results.

Continue reading “Printing Photorealistic Images on 3D Objects”

ANUBIS, A Natural User Bot Interface System

[Matt], [Andrew], [Noah], and [Tim] have a pretty interesting build for their capstone project at Ohio Northern University. They’re using a Microsoft Kinect, and a Leap Motion to create a natural user interface for controlling humanoid robots.

The robot the team is using for this project is a tracked humanoid robot they’ve affectionately come to call Johnny Five. Johnny takes commands from a computer, a Kinect, and a Leap Motion to move the chassis, arm, and gripper around in a way that’s somewhat natural, and surely a lot easier than controlling a humanoid robot with a keyboard.
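
Their actual code talks to the Kinect and Leap Motion SDKs, but the core trick — turning tracked joint positions into joint angles for the robot — fits in a few lines. Here’s a hedged sketch of that idea; the joint coordinates, the serial port, and the “ELBOW” command format are invented for illustration.

```python
# Sketch of the natural-UI idea: derive an elbow angle from three tracked
# joint positions and send it to the robot. Joint data, serial port, and the
# command format are hypothetical; the real project uses the Kinect and
# Leap Motion SDKs with its own protocol.
import numpy as np
import serial

def elbow_angle(shoulder, elbow, wrist):
    """Angle at the elbow, in degrees, from three 3D joint positions."""
    upper = np.asarray(shoulder, dtype=float) - np.asarray(elbow, dtype=float)
    fore = np.asarray(wrist, dtype=float) - np.asarray(elbow, dtype=float)
    cos_a = np.dot(upper, fore) / (np.linalg.norm(upper) * np.linalg.norm(fore))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

robot = serial.Serial("/dev/ttyUSB0", 115200)          # hypothetical port and baud

# Example joint positions as a skeleton tracker might report them (metres)
angle = elbow_angle((0.0, 1.4, 2.0), (0.1, 1.2, 1.9), (0.3, 1.2, 1.7))
robot.write(f"ELBOW {int(angle)}\n".encode())          # made-up command format
```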

The team has also released all of their software on GitHub under an open source license. You can grab it there, or take a look at some of the pics and videos from the Columbus Mini Maker Faire.

Super Smash Bros Gets a Revamp with the Microsoft Kinect

[Eric] just sent in this awesome Kinect hack that he and a few friends worked on: playing Super Smash Bros with a Kinect.

The system makes use of two Kinects and three PCs. The first Kinect records each individual player’s moves, while the second Kinect watches both players “fight” each other. The first PC runs a Nintendo 64 emulator to play the game.

The second PC runs a camera with OpenCV to add another cool, but perhaps unnecessary, feature: even the character selection is a physical process, adding to the idea of playing the entire game with your body. A glass table lets players set a 3D printed token onto the glass, effectively placing it on the character they would like to use.

And when the match ends, a windshield wiper knocks off the losing player’s token from the table.

The third PC is responsible for running both Kinects and sends the resulting commands back to the first PC over a TCP connection for input into the game.
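
We don’t know the team’s exact wire format, but the shape of that link is straightforward: the Kinect PC opens a socket to the emulator PC and streams button commands as gestures are recognized. The address, port, and newline-delimited command strings below are assumptions, not their protocol.

```python
# Rough sketch of the gesture-PC end of the TCP link. The address, port, and
# newline-delimited commands are guesses; the real protocol is the team's own.
import socket

EMULATOR_PC = ("192.168.1.10", 5555)    # hypothetical address of the N64-emulator PC

def send_commands(commands):
    """Stream recognized button presses, e.g. ["A", "B", "LEFT"], to the emulator."""
    with socket.create_connection(EMULATOR_PC) as sock:
        for cmd in commands:
            sock.sendall((cmd + "\n").encode())

# e.g. after the Kinect recognizes a smash-attack gesture:
send_commands(["RIGHT", "A"])
```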

They introduced it to the public at MHacks Fall 2014, a hacking competition sponsored by Dell and Intel. Video below.

Continue reading “Super Smash Bros Gets a Revamp with the Microsoft Kinect”

Robotic Terminator Teddy Will Protect You While You Sleep

This animatronic teddy bear is the stuff of nightmares… or dreams if you’re into mutant robot toys. In either case, this project by [Erwin Ried] is charming and creepy, as he gives life to an unassuming stuffed animal by implanting it with motorized parts.

[Erwin] achieves several degrees of motion throughout the bear’s body by filling the skin with a series of 3D printed bones, joined by servo motors at its shoulders, elbows, and neck. The motors are controlled via an Arduino running as a slave to a custom application written in C#. This application uses the motion tracking and facial recognition features of the Xbox Kinect, mapping the input from the puppeteer’s movement to the motors of the doll’s skeleton. Additionally, two red LEDs illuminate under the bear’s cheeks in response to the facial expression of the person controlling it, as an additional reminder that teddy feels what you feel.
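
The PC side is a C# application against the Kinect SDK, which we won’t reproduce, but the Arduino-as-slave arrangement boils down to streaming servo angles, plus an LED state, over serial. Below is a hedged sketch of that link in Python; the “S<id>:<angle>” protocol and port name are assumptions.

```python
# Sketch of the PC-to-Arduino link for the puppet: stream servo angles and the
# cheek-LED state over serial. The "S<id>:<angle>" / "LED:<0|1>" protocol and
# the port are assumptions; [Erwin]'s real application is written in C#.
import serial

arduino = serial.Serial("/dev/ttyACM0", 57600)       # hypothetical port and baud

def set_servo(servo_id, angle_deg):
    angle_deg = max(0, min(180, int(angle_deg)))     # keep the servo in range
    arduino.write(f"S{servo_id}:{angle_deg}\n".encode())

def set_cheek_leds(expression_detected):
    arduino.write(f"LED:{1 if expression_detected else 0}\n".encode())

# e.g. mirror the puppeteer's right elbow onto servo 3 and light the cheeks
set_servo(3, 120)
set_cheek_leds(True)
```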


In [Erwin’s] video, he demonstrates what his application sees through the Kinect’s camera side-by-side with the mechanical skeleton it’s controlling. The finished product isn’t something I’d soon cuddle up to at night, but it looks amazing and is fun to watch in action:

Continue reading “Robotic Terminator Teddy Will Protect You While You Sleep”

Creepy Cat Eyes with a Microsoft Kinect


Ever feel like someone is watching you? Like, somewhere in the back of your mind, you can feel the peering eyes of something glancing at you? Tapping into that paranoia is this computer science graduate project, created during a “Tangible Interactive Computing” class at the University of Maryland by two bright young students, [Josh] and [Richard], with help from the HCIL hackerspace.

Their professor, [Dr. Jon Froehlich], wanted the students to ‘seamlessly couple the dual worlds of bits and atoms’ and create something that would ‘explore the materiality of interactive computing.’ This relatively simple idea does just that, guaranteeing some good reactions.

As you’ve probably gathered from the title, this project uses a Microsoft Kinect to track the movement of nearby people. The tracking data is then translated into motion commands for the mounted eyeballs, producing a creepy vibe that radiates from the feline robot poster.
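
A bare-bones version of that loop, assuming the libfreenect Python bindings and a serial-connected servo controller: find where the nearest thing is, left to right, in the depth frame and aim the eyes at it. The depth threshold, servo range, and “PAN” command are our guesses, not details from the build.

```python
# Toy eye-tracking loop: locate the nearest blob in the Kinect depth frame and
# pan the eyes toward it. Threshold, serial port, servo range, and the "PAN"
# command are guesses, not details from the actual build.
import time
import freenect
import numpy as np
import serial

eyes = serial.Serial("/dev/ttyUSB0", 9600)         # hypothetical servo controller

while True:
    depth, _ = freenect.sync_get_depth()           # raw 11-bit depth, 480x640
    near = depth < 800                             # "someone is close" (guessed)
    if near.any():
        x_norm = np.nonzero(near)[1].mean() / depth.shape[1]   # 0.0 left .. 1.0 right
        pan = int(30 + 120 * (1.0 - x_norm))       # mirror and map to 30-150 degrees
        eyes.write(f"PAN {pan}\n".encode())
    time.sleep(0.05)
```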

Continue reading “Creepy Cat Eyes with a Microsoft Kinect”