[Kyle McDonald] has kept himself busy working on 3D scanning in real time. He’s posted a writeup that takes us through the concepts, tools, and assembly of a DIY 3D scanning camera. You may remember the preview of this method posted earlier this month, but now it’s time to build your own. You’ll need a camera, a projector, and some open source software to process the image data. Using these simple tools, [Kyle] turned out much better video than before. Take a look after the break to see his results from scanning at 60 fps using a PS3 Eye. The trick to this setup is getting the correct synchronization between the projector and the camera, something that could be improved with a bit of extra hacking.
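The writeup covers the software in detail, but the core math behind this style of structured-light scanning is compact enough to sketch here. The projector displays sine-wave fringe patterns shifted by 120°, and the per-pixel phase recovered from three captured frames maps to a projector column, and from there to depth by triangulation. A minimal numpy sketch of the decode step (our own illustration, not [Kyle]'s code; `decode_three_phase` and its thresholding are assumptions):

```python
import numpy as np

def decode_three_phase(i1, i2, i3):
    """Recover wrapped phase from three fringe images shifted by 120 degrees.

    Inputs are float arrays of equal shape, one per projected pattern:
      i1 = A + B*cos(phi - 2pi/3), i2 = A + B*cos(phi), i3 = A + B*cos(phi + 2pi/3)

    Returns the wrapped phase per pixel, plus a "data modulation" map:
    pixels with low modulation were poorly lit (shadow, off-screen) and
    their phase should be discarded before unwrapping.
    """
    phase = np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
    modulation = np.sqrt(3.0 * (i1 - i3) ** 2 + (2.0 * i2 - i1 - i3) ** 2) \
        / (i1 + i2 + i3 + 1e-9)
    return phase, modulation
```

Unwrapping the phase and calibrating the camera–projector pair are the hard parts the open source software handles; the sync problem mentioned above matters because a frame captured mid-swap between patterns corrupts all three inputs at once.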
They’ve come a long way since we last looked in on the progress. The hardware used is pretty much the same: a set of sunglasses sans lenses with the image sensor from a Sony PlayStation 3 Eye mounted in front of one eye. IR LEDs surround the sensor and point at the eye to increase the contrast between the pupil and the rest of the eye. The major improvement comes with the software. Eye tracking appears to be extremely precise, and they’ve written a custom drawing program to take advantage of their interface. Check in on their developer page for source code and a video walk-through of the software.
After the break you can see video of [Tempt1] using the system to create some tags. We’re thankful for the success this project has seen as this guy can do a lot better with his eye than we can with our hands.
[Max] was happy to see that the PlayStation 3 Eye has support in newer Linux kernels. The camera had been sitting in his closet for quite some time, so this gives it another chance at usefulness. Unfortunately, the driver doesn’t include framerate selection or color correction, so he set about writing a patch to control the color settings. As you can see above, his success greatly improves the image quality you get from the device.
We get the feeling that the camera peripherals for Sony’s gaming devices seem like a good idea but don’t have much staying power as a realistic gaming interface. With contributions like [Max’s], they can be re-purposed. The PS2 had its own camera, the EyeToy, which has long enjoyed driver support for Linux. The NUI Group does a lot of work with multi-touch and recommends the PS3 Eye for use with their projects because it’s inexpensive, with a high frame rate and decent picture quality.
Great work, [Max]. It looks like he’s sent this patch upstream to be considered for incorporation into the kernel’s webcam module.
The NUI Group has been working hard to bring the PS3 Eye to Windows. From the factory, this device has pretty impressive specs, but no Windows drivers. After a bit of hacking, they’ve developed a driver for it and released it on their forums. The main reason they’re so interested in it is that it can capture full frames at 60 frames per second, making it perfect for multi-touch sensing. Now that they’ve got it working with Windows, they’re working on a custom PS3 Eye filter for touchlib.
Opto-Isolator is an interesting art installation that was on display at the Bitforms Gallery in NYC. This single movement-tracking eye creates a statement about how we view art and is a response to the question “what if art could view us?” The somewhat creepy display not only follows the person viewing it, but mimics the viewer’s blinks a second later and averts its gaze if eye contact is kept up for too long. Its creators, [Golan Levin] and [Greg Baltus], have done a great job mimicking human behavior with such a simple element, and the social implications of it are truly fascinating.
If they wanted to, [Levin] and [Baltus] could possibly crank up the spook factor by adding facial recognition and programming it to remember how certain people interact with it, then tailor its behavior to wink at different rates or become more shy or bold, depending on the personality of the person watching it. Of course, that would require that someone go back to it more than once…
[Jason S. Babcock] and [Jeff B. Pelz] put together this paper on building a simple, lightweight eyetracker (PDF) to foster the creation of open source eyetracking software. All of the components are mounted to a cheap pair of safety glasses. The eyetracker uses a technique called “dark-pupil” illumination. An IR LED is used to illuminate the eye. The pupil appears as a dark spot because it doesn’t reflect the light. A bright spot also appears on the cornea where the IR is directly reflected. An eye camera is mounted next to the IR LED to record an image of the eye with these two spots. Software tracks the difference between the two spots to determine the eye orientation. A laser mounted to the frame helps with the initial calibration process. A scene camera placed above the eye records what the eye is viewing. The video from these two cameras can be compared in real time or after the experiment is concluded.
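The pupil-glint difference at the heart of dark-pupil tracking fits in a few lines of numpy. Here’s a toy sketch (our own illustration, not code from the paper; the thresholds and `track_dark_pupil` name are assumptions) that finds the dark pupil blob and the bright corneal reflection in an 8-bit grayscale IR frame and returns the vector between them:

```python
import numpy as np

def track_dark_pupil(frame, pupil_thresh=50, glint_thresh=220):
    """Estimate gaze from one 8-bit grayscale IR eye image.

    The pupil absorbs the IR and shows up as a dark blob; the corneal
    reflection is a small bright spot. The pupil-glint vector is the
    useful signal: it shifts with eye rotation but stays fairly stable
    when the headgear slips slightly.
    """
    ys, xs = np.nonzero(frame < pupil_thresh)   # dark pupil pixels
    pupil = np.array([xs.mean(), ys.mean()])    # centroid as (x, y)
    ys, xs = np.nonzero(frame > glint_thresh)   # corneal glint pixels
    glint = np.array([xs.mean(), ys.mean()])
    return pupil, glint, glint - pupil
```

A real tracker (including the one in the paper’s software) does considerably more, fitting an ellipse to the pupil edge and rejecting blinks, but the calibration step maps exactly this kind of pupil-glint vector onto points in the scene camera’s view.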