There’s a lot of tech that goes into animatronics, cosplay, and costumes. For their Hackaday Prize entry, [Dasaki] and [Dylan] are taking the eyes in a costume or Halloween prop to the next level with animatronic eyes that look where the wearer of this crazy confabulation is looking. It’s XEyes in real life, and it promises to be a part of some very, very cool costumes.
The mechanics of this system are actually pretty simple — it’s just a few servos joined together to make a pair of robotic eyes move up and down and left to right. The entire mechanism is mounted on a frame, to which is attached a very small camera pointed directly at the user’s (real) eye. The software is where things get fun: it’s a basic eye-tracking setup, with IR light illuminating the pupil and a compute unit that calculates where the user is looking.
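The core of a build like this is turning a tracked pupil position into servo commands. Here’s a minimal sketch of that mapping, assuming the eye tracker already delivers pupil coordinates normalized to [0, 1]; the servo angle range and the linear mapping are illustrative, not taken from the project.

```python
# Sketch: map a tracked pupil position to pan/tilt servo angles.
# Assumes pupil coordinates are already normalized to [0, 1] by the
# eye tracker; the servo range values below are hypothetical.

def gaze_to_servo(px, py, min_deg=30.0, max_deg=150.0):
    """Linearly map normalized pupil coords to servo angles in degrees."""
    span = max_deg - min_deg
    pan = min_deg + px * span            # left/right servo
    tilt = min_deg + (1.0 - py) * span   # up/down servo (image y is inverted)
    return pan, tilt

pan, tilt = gaze_to_servo(0.5, 0.5)
print(pan, tilt)  # centered gaze -> both servos at 90 degrees
```

A real build would also clamp and smooth the output so the animatronic eyes don’t jitter with every micro-saccade.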
For the software, [Dasaki] and [Dylan] have collected a bunch of links, but right now the best solutions are the OpenMV and the Eye of Horus project from last year’s Hackaday Prize. It’s a great project, and a really fun entry for the Automation portion of this year’s Hackaday Prize.
Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head, has a camera pointing out, but also a camera pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell you exactly what you were looking at. It’s exactly what you need to tell if people are actually looking at ads.
For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you want if you want to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – to give people with disabilities who cannot otherwise control their surroundings a way to do so.
There are really only a few things you need to build an eye tracking camera – a pair of cameras and a bit of software. The cameras are just webcams, with the IR filters removed and a few IR LEDs aimed at the eye so the eye-facing camera can see the pupil. The second camera is pointed directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
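With the IR LEDs lighting the eye, the pupil shows up as the darkest blob in the eye-facing camera’s frame, so a first-pass detector can be as simple as a threshold and a centroid. This is a minimal sketch of that idea, not the project’s actual software; the synthetic frame stands in for a real webcam capture.

```python
import numpy as np

# Sketch: dark-pupil detection on an IR-illuminated eye image.
# Under IR lighting the pupil is the darkest blob in the frame, so a
# threshold-and-centroid pass is enough for a first cut.

def pupil_center(gray, thresh=40):
    """Return the (x, y) centroid of pixels darker than `thresh`, or None."""
    ys, xs = np.nonzero(gray < thresh)
    if len(xs) == 0:
        return None  # no dark blob found
    return xs.mean(), ys.mean()

# Fake 100x100 frame: bright background, dark 10x10 "pupil" at (60, 30)
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[30:40, 60:70] = 10
print(pupil_center(frame))  # centroid lands in the middle of the dark blob
```

A production tracker would use contour fitting or ellipse fitting to reject eyelashes and shadows, but the threshold-and-centroid trick is where most of these builds start.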
The electronics are rather interesting, with all the processing running on a VoCore. It’s Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.
There’s a lot you can do with eye and gaze tracking, when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.
Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.
To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking is overlaid with the image from the front of the headset, and right away the user has a visual indication of where they’re looking.
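Overlaying gaze on the front camera’s image requires a calibration step: the user fixates on known points, and a fit maps pupil coordinates from the eye camera to pixel positions in the scene camera. Here’s a sketch of that mapping using a least-squares affine fit; the calibration points below are made up for illustration, and real setups often use a higher-order polynomial.

```python
import numpy as np

# Sketch: map pupil positions (eye camera) to pixel positions in the
# scene (front-facing) camera via a least-squares affine calibration.
# The calibration data here is hypothetical.

def fit_affine(pupil_pts, scene_pts):
    """Fit scene = [px, py, 1] @ A by least squares; A is 3x2."""
    P = np.column_stack([pupil_pts, np.ones(len(pupil_pts))])
    A, *_ = np.linalg.lstsq(P, scene_pts, rcond=None)
    return A

def to_scene(A, pupil_xy):
    """Apply the calibration to a new pupil position."""
    return np.array([*pupil_xy, 1.0]) @ A

# Hypothetical calibration: normalized pupil coords -> scene pixels
pupil = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.8, 0.8]])
scene = np.array([[100, 100], [540, 100], [100, 380], [540, 380]])
A = fit_affine(pupil, scene)
print(to_scene(A, (0.5, 0.5)))  # gaze at the center maps near [320, 240]
```

Once `A` is fitted, drawing a marker at `to_scene(A, pupil)` on every front-camera frame gives exactly the kind of live gaze overlay described above.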
Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and extraordinarily high-tech heads-up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads-up displays possible.
Continue reading “Eye Tracking With The Oculus Rift”
The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques in measuring the absolute position of the human eye. Cornell students [Michael and John] took an interesting approach for their final project and designed a phototransistor-based eye-tracking system.
We can definitely see the potential of this project, but for their first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. They designed with an end product in mind, so the system consists of a pair of custom 3D-printed glasses and a wireless receiver, avoiding the need to be tethered to the computer under control. The horizontal position of the mouse pointer is controlled via the infrared eye-tracking mechanism, consisting of an infrared LED positioned above the eye and two phototransistors located on each side of the eye. The measured analog data from the phototransistors determines the eye’s horizontal position. The vertical movement of the mouse pointer is controlled with the help of a 3-axis gyroscope mounted to the glasses. The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, because similar projects we’ve seen have been camera-based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see if the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
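The phototransistor trick boils down to comparing how much IR each side of the eye reflects: as the dark iris swings toward one sensor, that side’s reading changes relative to the other. A normalized difference of the two ADC readings gives a signed horizontal position. This sketch is our own illustration of the idea, with hypothetical ADC values and an arbitrary sign convention, not the students’ actual code.

```python
# Sketch: estimating horizontal eye position from two phototransistors.
# The IR LED lights the eye; as the dark iris turns toward one sensor,
# the reflected IR seen by that side changes relative to the other.
# ADC readings and the sign convention here are hypothetical.

def horizontal_gaze(left_adc, right_adc):
    """Return gaze in [-1, 1]: negative = toward left sensor, positive = right."""
    total = left_adc + right_adc
    if total == 0:
        return 0.0  # no signal from either sensor
    return (right_adc - left_adc) / total

print(horizontal_gaze(512, 512))  # equal readings -> looking straight ahead
print(horizontal_gaze(300, 700))  # imbalance -> gaze off to one side
```

Normalizing by the sum makes the estimate somewhat robust to overall changes in ambient IR, which matters a lot with a sensor this simple.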
Continue reading “PhotoTransistor Based Eye-Tracking”
Eye tracking is a really cool technology used in dozens of fields ranging from linguistics and human-computer interaction to marketing. With a proper eye tracking setup, it’s possible for a web developer to see if their changes to the layout are effective, to measure how fast someone reads a page of text, and even diagnose medical disorders. Eye tracking setups haven’t been cheap, though, at least until now. Pupil is a serious, research-quality eye tracking headset designed by [Moritz] and [William] for their thesis at MIT.
The basic idea behind Pupil is to put one digital camera facing the user’s eye while another camera looks out on the world. After calibrating the included software, the headset looks at the user’s pupil to determine where they’re actually looking.
The hardware isn’t specialized at all – just a pair of $20 USB webcams, an LED, an infrared filter made from exposed 35mm film negatives, and a 3D printed headset conveniently for sale at Shapeways.
The software for Pupil is based on OpenCV and OpenGL and is available for Mac and Linux. Calibration is easy, as seen in the videos after the break, and the results are amazing for an eye tracking headset thrown together for under $100.
Continue reading “Build an eye tracking headset for $90”
This is a great hack, and it’s an advertisement. We wish this were the norm when it comes to advertising because they’ve really got our number. Skittles enlisted a few engineers to build a web interface that moves robot-powered candies.
When we started looking into this we figured that a few robots were covered with over-sized cases that looked like Skittles. But that’s not it at all. What you see above is actually upside down. The top side of the white surface has one tiny wheeled robot for each candy. A magnet was embedded in each Skittle, which holds it to the underside of the surface. The user interface was rolled out on a Facebook page. It uses a common webcam for eye tracking. When you move your eyes, the robot controlling your assigned candy moves in that direction. See for yourself in the clip after the break.
So we say bravo Mars Inc. We love it that you decided to show off what’s behind the curtain. As with the Hyundai pixel wall, there’s a whole subset of people who might ignore the ad, but will spend a lot of time finding out how it was done.
Continue reading “Webcam eye-tracking moves robot-powered Skittles candy”
[Bruce Land], professor at Cornell, is a frequent submitter to our tip line. Usually he sends in a few links every semester from undergraduate electronics courses. Now the fall semester is finally over and it’s time to move on to the more ambitious master’s projects.
First up is a head-mounted eye tracker. [Anil Ram Viswanathan] and [Zelan Xiao] put together a lightweight and low-cost eye tracking project that will record where the user is looking.
The eye tracker hardware is made of two cameras mounted on a helmet. The first camera faces forward, looking at the same thing the user is. The second camera is directed towards the user’s eye. A series of algorithms detects the iris of the user’s eye and overlays the expected gaze position on the output of the first camera. Here’s the design report. PDF, natch.
Next up is a face tracking project implemented on an FPGA. This project started out as a software implementation of a face tracking algorithm in MATLAB. [Thu-Thao Nguyen] translated this MATLAB code to Verilog and eventually got her hardware running on an FPGA dev board. Another design report.
Having a face detection and tracking system running on an FPGA is extremely interesting; the FPGA makes face tracking a very low-power and hopefully lower-cost solution, allowing it to be used in portable and consumer devices.
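The kind of face tracking that translates well from MATLAB to Verilog is usually a streaming, per-pixel pass: classify each pixel as skin or not with fixed thresholds, then track the centroid of the skin mask. Here’s a software sketch of that style of algorithm; the RGB thresholds are a common rule of thumb, not values from [Thu-Thao Nguyen]’s actual code.

```python
import numpy as np

# Sketch of a pixel-level face-tracking pass that maps well to an FPGA:
# fixed-threshold skin classification followed by a centroid of the mask.
# Thresholds are illustrative, not taken from the project's MATLAB code.

def skin_mask(rgb):
    """Crude fixed-threshold skin classifier on an HxWx3 uint8 image."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def face_centroid(rgb):
    """Centroid (x, y) of the skin-classified pixels, or None."""
    ys, xs = np.nonzero(skin_mask(rgb))
    if len(xs) == 0:
        return None
    return xs.mean(), ys.mean()

# Synthetic frame: blue background with a skin-toned 20x20 patch
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[..., 2] = 200                    # blue background
frame[40:60, 70:90] = (200, 120, 90)   # skin-toned block
print(face_centroid(frame))  # centroid lands in the middle of the patch
```

Because each pixel is classified independently and the centroid is just two running sums and a count, the whole thing fits in a single pass over the video stream — exactly the shape of computation an FPGA handles cheaply.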
You can check out the videos for these projects after the break.
Continue reading “Two computer vision builds from Cornell”