[Don’t stop the clock] is doing some work with a projector, a camera, and a Kinect. What he’s accomplished is quite impressive: combining the three to manipulate light with your body. The image above is a safer rendition of the Hadouken from the Street Fighter video games, throwing light across the room instead of fire. This comes at the end of the video after the break, but first he shows off the core features of the system. You can hold up your hand and wave it to turn it into a light source. In other words, the projector shines light on your hand, following it as it moves and adjusting the intensity based on its location in 3D space. Since the Kinect sends fairly precise data back to the computer, the projected image is trimmed to match your hand and arm without overflowing onto the rest of the room, until you touch your hand to a surface you want illuminated or throw the light source with a flick of the wrist. It may seem trivial at first glance, but we find the alignment of the projector and the speed at which the image updates quite impressive.
[Thanks Vasili]
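The trimming trick described above boils down to masking the projector's output with the Kinect's depth map: only pixels whose depth falls inside a near band (the hand and arm) stay lit. A minimal sketch of that idea, assuming a raw depth frame in millimeters and illustrative band thresholds (not the values from the video):

```python
import numpy as np

def mask_to_depth_band(image, depth, near_mm=500, far_mm=900):
    """Keep only pixels whose depth falls in [near_mm, far_mm];
    everything else goes black, so the projector lights just the hand/arm."""
    band = (depth >= near_mm) & (depth <= far_mm)  # boolean hand/arm mask
    return image * band[..., np.newaxis]           # zero out everything else

# Synthetic example: a 4x4 white frame with one "hand" pixel at 700 mm.
depth = np.full((4, 4), 2000, dtype=np.uint16)  # background ~2 m away
depth[1, 2] = 700                               # hand inside the near band
frame = np.full((4, 4, 3), 255, dtype=np.uint8)
out = mask_to_depth_band(frame, depth)
# Only pixel (1, 2) stays lit; every other pixel is zeroed.
```

The real system also needs a projector-camera calibration so the mask lands on the physical hand, which is exactly the alignment problem Johnny Lee's auto-calibration work addressed.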
The Kinect has come full circle. Johnny Lee started with this: http://hackaday.com/2007/11/15/automatic-projector-calibration/, and then went on to (help) design the Kinect. Now that the Kinect exists, it is being used for the same thing all over again.
Very cool! Reminded me of this old gem: http://hackaday.com/2007/11/15/automatic-projector-calibration/
I am always surprised by the rotated 3D rendering we see at the beginning of the video; it has a high coolness factor. I could play with this for hours.
Wow, awesome! This is the basis of sensorless mocap. I wonder how long until we 3D animators are out of a job, replaced by a machine!
sweet dreams!
The ‘jaw-drop’ moment for me was when he approached the wall and the projector dropped the light source onto it. Super impressive!
Mirroring his projection would be cool too. He could dance with himself.
This is an interesting idea. I’d think it’d be neat to project “skins” onto people in the scene. All his future videos could be presented by him in a projected suit and tie. Or like the singing ghost heads from a while back, he could wear some sort of mask and have a virtual head projected on it.
“Jaw drop”? Why? It’s simply projector alignment. I can do the same with a gyromouse or a wiimote.
What is impressive is the auto-calibration, and that it doesn’t lose the right hand and accidentally start following the left.
Things with the Kinect are happening fast: during the first days, using only the partially open driver, you could already do great stuff. In a few days I could get a proper calibration and map virtual objects that interact with physical ones: http://geekjutsu.wordpress.com/2010/12/02/dar-deep-augmented-reality/
Now with OpenNI you get hand and skeleton tracking out of the box, which is really awesome. There are 1k things I can’t wait to try…
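For anyone curious what hand tracking looks like at its crudest, before reaching for OpenNI's full skeleton tracker: a naive baseline is to treat the closest valid point in the depth frame as the hand. This toy sketch (plain numpy, not the OpenNI API; the 400 mm validity cutoff is an assumption) shows the idea, including handling the Kinect's zero-value dropout pixels:

```python
import numpy as np

def nearest_point(depth, min_valid=400):
    """Naive 'hand' tracker: return (row, col) of the closest valid pixel.
    Real middleware (OpenNI/NITE) does far more; this is just the intuition."""
    # Push invalid/too-near readings (e.g. Kinect dropout zeros) to max depth
    # so argmin ignores them.
    valid = np.where(depth >= min_valid, depth, np.iinfo(depth.dtype).max)
    return np.unravel_index(np.argmin(valid), valid.shape)

depth = np.full((6, 8), 1500, dtype=np.uint16)  # background plane
depth[2, 5] = 600   # hand, closest thing to the camera
depth[4, 1] = 0     # sensor dropout (invalid zero reading)
pos = nearest_point(depth)  # -> (2, 5)
```

In practice you would smooth the position over frames and segment a blob around it rather than trust a single pixel, which is part of why the tracking in the video holding onto the correct hand is notable.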
@fartface: except for at 3:31
awesome =)
I could imagine some pretty epic scenarios when coupling this with another projector and some 3D glasses…