Two Kinects Plus One HD Projector Makes The Coolest “Snowglobe” Ever

Looking for something to replace the flat-screen display that was amazing in your house ten years ago? How about a spherical display, something like a snowglobe (or a crystal ball?), that shows the image you are watching correctly no matter what angle you view it from?

This amazing student project from Queen's University combines elements that many hackers are familiar with: Kinect sensors, a 3D projector, and a giant acrylic sphere. Actually, most people have never worked with a giant acrylic sphere, but they look like fun. Check out the video after the break. Continue reading “Two Kinects Plus One HD Projector Makes The Coolest “Snowglobe” Ever”

Amazing 3D Telepresence System

It looks like the world of Kinect hacks is about to get a bit more interesting.

While many of the Kinect-based projects we see use one or two units, this 3D telepresence system developed by UNC Chapel Hill student [Andrew Maimone] under the guidance of [Henry Fuchs] has them all beat.

The setup uses up to four Kinect sensors at a single endpoint, capturing images from various angles before they are processed using GPU-accelerated filters. The video captured by the cameras is processed in a series of steps, filling holes and adjusting colors to create a mesh from each viewpoint. Once the video streams have been processed, they are overlaid with one another to form a complete 3D image.
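
As a rough illustration of the per-camera processing described above, here is a minimal sketch in Python. The intrinsics, the 4x4 extrinsic matrices, and the use of OpenCV's inpainting as the hole-filling step are all assumptions for illustration, not details of [Andrew Maimone]'s GPU implementation:

```python
import numpy as np
import cv2

# Hypothetical Kinect-style depth intrinsics (not from the actual project).
FX, FY, CX, CY = 594.2, 591.0, 320.0, 240.0

def fill_holes(depth_mm):
    """Fill zero-depth holes, a CPU stand-in for the GPU-accelerated filters."""
    holes = (depth_mm == 0).astype(np.uint8)
    # cv2.inpaint needs an 8-bit image, so scale the depth range down first.
    depth8 = cv2.convertScaleAbs(depth_mm, alpha=255.0 / 4000.0)
    filled8 = cv2.inpaint(depth8, holes, 3, cv2.INPAINT_NS)
    out = depth_mm.copy()
    out[holes == 1] = filled8[holes == 1].astype(np.float32) * (4000.0 / 255.0)
    return out

def depth_to_points(depth_mm):
    """Back-project a depth image into an N x 3 point cloud (camera frame)."""
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0            # millimetres -> metres
    pts = np.dstack(((u - CX) * z / FX, (v - CY) * z / FY, z)).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                           # drop empty pixels

def merge_clouds(depth_frames, extrinsics):
    """Move each camera's cloud into a shared world frame and stack them."""
    clouds = []
    for depth, cam_to_world in zip(depth_frames, extrinsics):  # 4x4 matrices
        pts = depth_to_points(fill_holes(depth))
        pts_h = np.hstack((pts, np.ones((len(pts), 1), np.float32)))
        clouds.append((pts_h @ cam_to_world.T)[:, :3])
    return np.vstack(clouds)
```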

The result is an awesome real-time 3D rendering of the subject and surrounding room that reminds us of this papercraft costume. The 3D video can be viewed at a remote station which uses a Kinect sensor to track your eye movements, altering the video feed’s perspective accordingly. The telepresence system also offers the ability to add in non-existent objects, making it a great tool for remote technology demonstrations and the like.
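
The perspective-tracking half can be pictured with an equally small sketch: feed the tracked head position into a standard look-at matrix and re-render each frame. The head coordinates here are hypothetical tracker output, not values from the project:

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a 4x4 view matrix from a tracked head (eye) position."""
    f = target - eye
    f = f / np.linalg.norm(f)
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)
    u = np.cross(s, f)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye
    return view

# Every new head position from the viewer-side Kinect re-renders the scene:
head = np.array([0.10, 0.05, 1.50])   # metres; hypothetical tracker output
view = look_at(head, target=np.zeros(3))
```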

Check out the video below to see a thorough walkthrough of this 3D telepresence system.

Continue reading “Amazing 3D Telepresence System”

Kinect-driven Cart Makes Shopping A Snap

[Luis de Matos] is working on a neat Kinect project called Wi-GO that aims, as many do, to enhance the lives of individuals with disabilities. While the Wi-GO project is geared towards disabled persons, it can be quite helpful to the elderly and pregnant women as well.

Wi-GO is a motorized shopping cart with a Kinect sensor mounted on the back. The sensor interfaces with a laptop and functions much as you would expect, scanning the area in front of the cart for objects and people. Once it identifies the individual it is meant to help, the cart diligently follows behind as the person goes about their typical shopping routine. The robot keeps a safe distance to avoid collisions, but remains within reach so that it can be used to carry goods.
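
For a sense of how a follow-behind cart might close the loop, here is a toy proportional controller. The distances, gains, and speed cap are invented for illustration and are not details from the Wi-GO project:

```python
# Toy follow-behind controller; all constants are illustrative assumptions.
FOLLOW_DIST = 1.2    # metres to keep behind the shopper
DEADBAND = 0.15      # metres of slack before the cart reacts
KP_LINEAR = 0.8      # forward-speed gain
KP_ANGULAR = 1.5     # turn-rate gain
MAX_SPEED = 0.6      # m/s cap for safety

def follow_step(person_x, person_z):
    """person_x, person_z: tracked torso position in the Kinect frame (m).

    Returns (forward_speed, turn_rate) for a differential-drive base.
    """
    range_error = person_z - FOLLOW_DIST
    speed = KP_LINEAR * range_error if abs(range_error) > DEADBAND else 0.0
    turn = KP_ANGULAR * person_x           # steer to keep the person centred
    return min(max(speed, 0.0), MAX_SPEED), turn  # never back into people
```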

If you take a look at the video below, you can see Wi-GO in action. It starts off by showing how difficult it would be for an individual in a wheelchair to use a shopping cart alone, and follows up by showing how much easier things are with Wi-GO in tow.

While the project is only in prototype form at the moment, we suspect that it will only be a matter of time until you see devices like Wi-GO in your local supermarket.

Continue reading “Kinect-driven Cart Makes Shopping A Snap”

Juggling With Kinect

Some of the Kinect hacks we have featured here are quite useful in the realm of assisted living, others showcase what can be done with the clever application of video filters. Some…are just plain fun.

These two Kinect hacks are not necessarily going to win any awards for usefulness, but they are big on fun. [Tom] put together a neat juggling application that watches for your hands to disappear behind your back, generating a glowing ball once they return to the camera’s field of vision. The balls can be tossed away or juggled, as you can see in the video below. It looks like it could be pretty fun, and most definitely easier than chasing balls around while learning to juggle.
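
The spawn logic described above boils down to watching hand visibility from frame to frame. A minimal sketch, with the tracker callback and data shapes assumed rather than taken from [Tom]'s code:

```python
# Spawn a ball whenever a hand reappears after being hidden.
balls = []
hand_was_visible = {"left": True, "right": True}

def on_skeleton_frame(hands):
    """hands maps 'left'/'right' to an (x, y, z) tuple, or None if hidden."""
    for side, pos in hands.items():
        visible = pos is not None
        if visible and not hand_was_visible[side]:
            # Hand just came back from behind the back: conjure a ball here.
            balls.append({"pos": list(pos), "vel": [0.0, 0.0, 0.0]})
        hand_was_visible[side] = visible
    # (A physics step that tosses and drops the balls is omitted here.)
```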

[Tom’s] hack was based on code he saw demonstrated in a video by YouTube user [hogehoge335], whose own Kinect application allows him to replicate the Kamehameha attack from Dragon Ball Z, flowing hair and all.

Check out the videos below for a demonstration of both Kinect hacks, and swing by the respective Google Code sites if you want to give them a try.

Continue reading “Juggling With Kinect”

Advanced Robotic Arm Control Using Kinect

[Ryan Lloyd], [Sandeep Dhull], and [Ruben D’Sa] wrote in to share a robotics project they have been keeping busy with lately. The three University of Minnesota students are using a Kinect sensor to remotely control a robotic arm, but it’s not as simple as it sounds.

Using OpenNI alongside PrimeSense’s NITE middleware, the team started out by doing some simple skeleton tracking before working with their robotic arm. The arm has five degrees of freedom, which makes controlling it a bit tricky. The robot has quite a few joints to play with, so the trio not only tracks shoulder, elbow, and wrist movements, but also monitors the status of the user’s hand to actuate the robot’s gripper.
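
Mapping a tracked skeleton onto a five-degree-of-freedom arm mostly comes down to measuring angles between joint vectors. A small sketch of that idea, with joint names and the gripper heuristic assumed for illustration, not taken from the team's code:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (radians) between segments b->a and b->c."""
    a, b, c = np.asarray(a), np.asarray(b), np.asarray(c)
    v1, v2 = a - b, c - b
    cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

def arm_command(skel):
    """skel maps joint names to (x, y, z) positions from the tracker."""
    elbow = joint_angle(skel["shoulder"], skel["elbow"], skel["wrist"])
    shoulder = joint_angle(skel["hip"], skel["shoulder"], skel["elbow"])
    grip_closed = skel["hand_openness"] < 0.5   # assumed hand-state signal
    return {"shoulder": shoulder, "elbow": elbow, "grip": grip_closed}
```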

When all was said and done, the results were pretty impressive, as you can see in the video below, but the team definitely sees room for improvement. Using inverse kinematics, they plan on filtering out some of the joint-tracking inaccuracies that occur when the shoulders are moved in a certain way. They also plan on using a robotic arm with even more degrees of freedom to see just how well their software can perform.
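
To picture the inverse-kinematics idea, here is the closed-form solution for a simplified two-link planar arm. Solving for the wrist target directly, rather than copying raw per-joint angles, is one way noisy shoulder readings can be filtered out. This is a generic textbook formulation, not the team's code:

```python
import numpy as np

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a two-link planar arm: returns (theta1, theta2)."""
    c2 = (x**2 + y**2 - l1**2 - l2**2) / (2.0 * l1 * l2)
    c2 = np.clip(c2, -1.0, 1.0)           # clamp out-of-reach tracker noise
    theta2 = np.arccos(c2)                # elbow angle
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2),
                                           l1 + l2 * np.cos(theta2))
    return theta1, theta2
```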

Be sure to check out their site to see more details and videos.

Continue reading “Advanced Robotic Arm Control Using Kinect”

3D Render Live With Kinect And Bubble Boy

[Mike Newell] dropped us a line about his latest project, Bubble Boy!, which uses the Kinect’s point cloud functionality to render polygonal meshes in real time. In the video, [Mike] goes through the entire process, from installing the libraries to grabbing code off of his site. Currently the rendering looks like a clump of dough (nightmarishly clawing at us with its nubby arms).
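
For anyone curious what "point cloud to polygonal mesh" means in practice, here is a naive grid-meshing sketch: back-project each depth pixel to a vertex, then connect neighboring pixels into triangles. The intrinsics are typical Kinect-ish values, and none of this is [Mike]'s actual implementation:

```python
import numpy as np

def depth_to_mesh(depth, fx=594.2, fy=591.0, cx=320.0, cy=240.0, step=4):
    """Turn a depth frame into (vertices, triangles) by meshing the pixel grid."""
    d = depth[::step, ::step].astype(np.float32) / 1000.0   # mm -> m
    h, w = d.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    verts = np.dstack(((u * step - cx) * d / fx,
                       (v * step - cy) * d / fy, d)).reshape(-1, 3)
    faces = []
    for r in range(h - 1):
        for c in range(w - 1):
            i = r * w + c
            # Only mesh cells where all four corners have valid depth.
            if min(d[r, c], d[r, c + 1], d[r + 1, c], d[r + 1, c + 1]) > 0:
                faces.append((i, i + 1, i + w))
                faces.append((i + 1, i + w + 1, i + w))
    return verts, np.asarray(faces)
```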

[Mike] is looking for suggestions on more efficient mesh and point cloud code, as he is unable to run at any higher resolution than what is shown in the video. You can hear his computer fan spool up after just a few moments of rendering! Anyone good with point clouds?
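
One common first answer to that question is voxel-grid downsampling, which thins the cloud before any meshing happens. A minimal sketch:

```python
import numpy as np

def voxel_downsample(points, voxel=0.02):
    """Keep one representative point per 2 cm voxel: fewer points, same shape."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first)]
```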

Also, check out his video after the jump.

Continue reading “3D Render Live With Kinect And Bubble Boy”

Using Kinect To Make Human Marionettes

[Choi Ka Fai] has been experimenting with neurostimulation for some time now. His body of work has focused on exploring the possibility of using neurostim devices to replay pre-recorded muscle movements.

Until now, he has been recording his muscle movements as acoustic waveforms for real-time playback in the bodies of his research partners. This usually requires him to sit beside the subject, tethered to a machine, which limits his movement. To get around this, he has invested in a new form of movement-recording technology: a Kinect sensor.

Using the fairly standard skeleton tracking we have seen in previous Kinect hacks, he can direct the motion of his subject merely by moving in front of the camera. The benefit of using the Kinect over wired sensors is that he can use any body part to direct his partner’s movements simply by changing how the software interprets his actions. As you can see in the video below, he uses his hands, knees, and even his head to direct the motion of his partner’s arm.
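
The remapping trick can be pictured in a few lines: the software just reads a different tracked joint and normalizes its position into an output level. Everything here, from joint names to ranges, is an assumption rather than a detail of [Choi]'s rig:

```python
# Remap which tracked joint drives the output by changing one name.
# Joint names, ranges, and the 0..1 output are illustrative assumptions,
# not details of the actual neurostim hardware.
SOURCE_JOINT = "head"        # try "left_hand", "right_knee", ...
Y_MIN, Y_MAX = -0.5, 0.5     # metres of joint travel mapped to full range

def stim_level(skeleton):
    """skeleton maps joint names to (x, y, z); returns a 0..1 intensity."""
    y = skeleton[SOURCE_JOINT][1]
    return min(max((y - Y_MIN) / (Y_MAX - Y_MIN), 0.0), 1.0)
```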

It really is a neat application of the Kinect, and we are totally digging the shaky “human marionette” effect that it produces. Since this was only an initial test of the system, expect to see some more cool stuff coming from [Choi] in the near future.

Stick around to see a quick video of the Kinect-driven neurostim rig in action.

Continue reading “Using Kinect To Make Human Marionettes”