Hackaday Prize Entry: DIY Eye Tracking

Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head, with one camera pointing out at the world and another pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell exactly what you’re looking at. It’s exactly what you need to tell if people are actually looking at ads.

For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you want if you need to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – for people with disabilities who can’t otherwise control their surroundings.

There are really only two things you need to build an eye tracking system – a pair of cameras and a bit of software. The cameras are just webcams with the IR filters removed and a few IR LEDs aimed at the eye, so the eye-facing camera can see the pupil. The second camera points directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
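
To make that concrete, here’s a minimal sketch of the eye-camera half of the pipeline in Python with OpenCV – not Makeroni Labs’ actual code, and the camera index and threshold value are assumptions you’d tune for your own hardware. Under IR illumination the pupil is the darkest blob in the frame, so a simple threshold-and-largest-contour search gets you its position:

```python
# A minimal pupil finder: threshold the IR-lit frame (the pupil is the
# darkest blob) and take the largest dark contour. OpenCV 4 assumed.
import cv2

cap = cv2.VideoCapture(0)  # eye-facing webcam (index is an assumption)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Invert the threshold so the dark pupil becomes the white mask region.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)
        (x, y), r = cv2.minEnclosingCircle(pupil)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```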

The electronics are rather interesting, with all the processing running on a VoCore. It’s Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.

Eye Tracking With The Oculus Rift

There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.

To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking data is overlaid on the image from the front of the headset, and right away the user has a visual indication of where they’re looking.
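
The overlay step itself is straightforward. Here’s a rough sketch of the idea, assuming you already have a pupil tracker producing a normalized gaze estimate – the gaze_estimate() function below is a hypothetical placeholder, and the camera indices are guesses:

```python
# Overlay a gaze marker on the world camera's frame. gaze_estimate() is a
# hypothetical placeholder for a real pupil tracker; camera indices are
# assumptions.
import cv2

eye_cam = cv2.VideoCapture(0)
world_cam = cv2.VideoCapture(1)

def gaze_estimate(eye_frame):
    """Hypothetical stand-in: return normalized gaze (x, y) in [0, 1]."""
    return 0.5, 0.5

while True:
    ok_eye, eye_frame = eye_cam.read()
    ok_world, world_frame = world_cam.read()
    if not (ok_eye and ok_world):
        break
    gx, gy = gaze_estimate(eye_frame)
    h, w = world_frame.shape[:2]
    # Red circle marks where the user is (estimated to be) looking.
    cv2.circle(world_frame, (int(gx * w), int(gy * h)), 12, (0, 0, 255), 2)
    cv2.imshow("world + gaze", world_frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

eye_cam.release()
world_cam.release()
```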

Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and in extraordinarily high-tech heads-up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads-up displays possible.

PhotoTransistor Based Eye-Tracking

The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques in measuring the absolute position of the human eye. Cornell students [Michael and John] took on an interesting approach for their final project and designed a phototransistor based eye-tracking system.

We can definitely see the potential of this project, but for their first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. With an end product in mind, they built the system as a pair of custom 3D-printed glasses plus a wireless receiver, avoiding any tether to the computer under control. The horizontal position of the mouse pointer is controlled by the infrared eye tracking mechanism: an infrared LED positioned above the eye and two phototransistors located on either side of it, with the analog readings from the phototransistors determining the eye’s horizontal position. Vertical movement of the pointer is handled by a 3-axis gyroscope mounted to the glasses.

The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, since similar projects we’ve seen have all been camera-based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see if the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
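
For a sense of how such a control loop might fit together, here’s a back-of-the-napkin sketch – not [Michael and John]’s firmware. The read_adc(), read_gyro_rate(), and move_mouse() functions are hypothetical stand-ins for the real drivers, and the gains are made up:

```python
# Hypothetical control loop: phototransistor difference drives pointer X,
# integrated gyro pitch rate drives pointer Y. All I/O functions below are
# stand-ins for real drivers, and the gains are invented.
import time

def read_adc(channel):
    """Hypothetical ADC driver: returns a 0-1023 phototransistor reading."""
    return 512

def read_gyro_rate():
    """Hypothetical gyro driver: pitch rate in degrees per second."""
    return 0.0

def move_mouse(px, py):
    """Hypothetical wireless HID call: position the host's pointer."""
    print(px, py)

x, y = 400.0, 300.0          # pointer position
H_GAIN, V_GAIN = 0.5, 2.0    # invented scale factors; tune on real hardware
DT = 0.01                    # loop period in seconds

while True:
    left, right = read_adc(0), read_adc(1)
    # The left/right phototransistor difference swings as the IR-lit eye
    # turns toward one sensor or the other.
    x += H_GAIN * (right - left)
    # Head pitch, integrated from the gyro, moves the pointer vertically.
    y += V_GAIN * read_gyro_rate() * DT
    move_mouse(int(x), int(y))
    time.sleep(DT)
```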

Build An Eye Tracking Headset For $90

Eye tracking is a really cool technology used in dozens of fields, ranging from linguistics to human-computer interaction to marketing. With a proper eye tracking setup, it’s possible for a web developer to see if changes to a layout are effective, to measure how fast someone reads a page of text, and even to diagnose medical disorders. Eye tracking setups haven’t been cheap, though, at least until now. Pupil is a serious, research-quality eye tracking headset designed by [Moritz] and [William] for their thesis at MIT.

The basic idea behind Pupil is to put one digital camera facing the user’s eye while another camera looks out on the world. After calibrating the included software, the headset looks at the user’s pupil to determine where they’re actually looking.
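
The calibration step amounts to fitting a mapping from pupil coordinates to scene coordinates. Here’s a minimal sketch of one common approach, a least-squares polynomial fit – an illustration of the idea with made-up calibration data, not Pupil’s actual software:

```python
# Least-squares calibration sketch: fit a quadratic mapping from pupil
# coordinates to scene coordinates. The calibration points are made up.
import numpy as np

def features(p):
    x, y = p
    return [1.0, x, y, x * y, x * x, y * y]

# Pupil position recorded while the user fixated each calibration target...
pupil_pts = np.array([[0.30, 0.40], [0.55, 0.38], [0.42, 0.60],
                      [0.35, 0.55], [0.50, 0.52], [0.44, 0.45]])
# ...and each target's known position in the world camera's frame.
scene_pts = np.array([[0.10, 0.10], [0.90, 0.10], [0.50, 0.90],
                      [0.25, 0.70], [0.75, 0.60], [0.50, 0.35]])

A = np.array([features(p) for p in pupil_pts])
coeffs, *_ = np.linalg.lstsq(A, scene_pts, rcond=None)  # 6x2 coefficients

def gaze_to_scene(pupil_xy):
    """Map a pupil position to the estimated look-at point in the scene."""
    return np.array(features(pupil_xy)) @ coeffs

print(gaze_to_scene((0.42, 0.50)))
```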

The hardware isn’t specialized at all – just a pair of $20 USB webcams, an LED, an infrared filter made from exposed 35mm film negatives, and a 3D printed headset conveniently for sale at Shapeways.

The software for Pupil is based on OpenCV and OpenGL and is available for Mac and Linux. Calibration is easy, as seen in the videos after the break, and the results are amazing for an eye tracking headset thrown together for under $100.

Webcam Eye-tracking Moves Robot-powered Skittles Candy

This is a great hack, and it’s an advertisement. We wish this were the norm when it comes to advertising because they’ve really got our number. Skittles enlisted a few engineers to build a web interface that moves robot-powered candies.

When we started looking into this we figured that a few robots were covered with over-sized cases that looked like Skittles. But that’s not it at all. What you see above is actually upside down. The top side of the white surface has one tiny wheeled robot for each candy, and a magnet embedded in each Skittle holds it to the underside of the surface. The user interface was rolled out on a Facebook page and uses a common webcam for eye tracking. When you move your eyes, the robot controlling your assigned candy moves in that direction. See for yourself in the clip after the break.
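
The interface logic could be as simple as quantizing the gaze estimate into a compass direction for your assigned robot. This is pure guesswork about the actual implementation, but here’s the flavor:

```python
# Guesswork sketch of the interface logic: quantize a normalized gaze
# offset into a move command for the user's assigned robot.
def gaze_to_command(gx, gy, dead_zone=0.15):
    """gx, gy: gaze offset from screen center in [-1, 1]."""
    if abs(gx) < dead_zone and abs(gy) < dead_zone:
        return "stop"
    if abs(gx) >= abs(gy):
        return "right" if gx > 0 else "left"
    return "down" if gy > 0 else "up"

print(gaze_to_command(0.6, -0.1))   # -> "right"
print(gaze_to_command(0.05, 0.02))  # -> "stop"
```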

So we say bravo, Mars Inc. We love that you decided to show off what’s behind the curtain. As with the Hyundai pixel wall, there’s a whole subset of people who might ignore the ad, but will spend a lot of time figuring out how it was done.

Two Computer Vision Builds From Cornell

[Bruce Land], professor at Cornell, is a frequent submitter to our tip line. Usually he sends in a few links every semester from undergraduate electronics courses. Now the fall semester is finally over and it’s time to move on to the more ambitious master’s projects.

First up is a head-mounted eye tracker. [Anil Ram Viswanathan] and [Zelan Xiao] put together a lightweight and low-cost eye tracking project that records where the user is looking.

The eye tracker hardware is made of two cameras mounted on a helmet. The first camera faces forward, looking at the same thing the user is. The second camera is directed towards the user’s eye. A series of algorithms detects the iris of the user’s eye and overlays the expected gaze position on the output of the first camera. Here’s the design report. PDF, natch.
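
The report has the details of their algorithms; as a generic illustration of one common way to find the iris in an eye-camera frame (not necessarily theirs), a Hough circle transform works well. The filename and radius bounds below are assumptions that depend on how close the camera sits to the eye:

```python
# Generic iris finder via Hough circles; an illustration, not the Cornell
# team's algorithm. Filename and radius bounds are assumptions.
import cv2
import numpy as np

eye = cv2.imread("eye_frame.png")  # assumed capture from the eye camera
gray = cv2.medianBlur(cv2.cvtColor(eye, cv2.COLOR_BGR2GRAY), 5)

circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                           param1=100, param2=30, minRadius=20, maxRadius=80)
if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)  # strongest circle = iris
    cv2.circle(eye, (x, y), r, (0, 255, 0), 2)
    # (x, y) is the point you'd map onto the forward camera's output.
```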

Next up is a face tracking project implemented on an FPGA. This project started out as a software implementation of a face tracking algorithm in MATLAB. [Thu-Thao Nguyen] translated this MATLAB code to Verilog and eventually got her hardware running on an FPGA dev board. Another design report.
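
For a sense of the kind of algorithm that translates well to hardware, here’s a generic software-only face tracker: skin-tone thresholding plus a centroid, which maps naturally onto pixel-streaming FPGA logic. This is an illustrative sketch with rule-of-thumb YCrCb bounds, not [Thu-Thao]’s actual MATLAB or Verilog:

```python
# Hardware-friendly face tracking sketch: skin-tone threshold in YCrCb plus
# a centroid. Generic illustration with rule-of-thumb bounds, not
# [Thu-Thao]'s MATLAB or Verilog.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    # Commonly cited skin-tone range in Cr/Cb; crude but cheap in hardware.
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    ys, xs = np.nonzero(mask)
    if len(xs) > 0:
        cx, cy = int(xs.mean()), int(ys.mean())  # centroid of skin pixels
        cv2.circle(frame, (cx, cy), 10, (0, 0, 255), 2)
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```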

Having a face detection and tracking system running on an FPGA is extremely interesting; the FPGA makes face tracking a very low-power and hopefully lower-cost solution, allowing it to be used in portable and consumer devices.

You can check out the videos for these projects after the break.

OpenCV Knows Where You’re Looking With Eye Tracking

[John] has been working on a video-based eye tracking solution using OpenCV, and we’re loving the progress. [John]’s pupil tracking software can tell anyone exactly where you’re looking and allows for free head movement.

The basic idea behind this build is simple: when looking straight ahead, a pupil is perfectly circular, but when an eye looks off to one side, the pupil looks more and more like an ellipse to a screen-mounted video camera. By measuring the dimensions of this ellipse, [John]’s software can make a very good guess where the eye is looking. If you want the extremely technical breakdown, here’s an ACM paper going over the technique.
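
The geometry is simple enough to sketch: a circle viewed at an angle theta from its normal projects to an ellipse whose minor/major axis ratio is cos(theta). Here’s a minimal illustration of the measurement in Python with OpenCV – the idea, not [John]’s code, and the threshold and filename are assumptions:

```python
# Ellipse measurement sketch: fit an ellipse to the dark pupil contour and
# recover the off-axis angle from its squashedness. Threshold and filename
# are assumptions; [John]'s implementation is more robust.
import cv2
import math

gray = cv2.cvtColor(cv2.imread("eye_frame.png"), cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)  # dark pupil
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

pupil = max(contours, key=cv2.contourArea)
(cx, cy), axes, angle = cv2.fitEllipse(pupil)  # needs a 5+ point contour
minor, major = sorted(axes)
# A circle viewed at angle theta projects with axis ratio cos(theta).
theta = math.degrees(math.acos(min(minor / major, 1.0)))
print(f"pupil at ({cx:.0f}, {cy:.0f}), roughly {theta:.1f} deg off-axis")
```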

Like the EyeWriter project this build was based on, [John]’s build uses IR LEDs around the edge of a monitor to increase the contrast between the pupil and the iris.

After the break are two videos showing the eye tracker in action. Watching [John]’s project at work is a little creepy, but the good news is a proper eye tracking setup doesn’t require the user to stare at their own eye.
