There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.
Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.
To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze estimate is overlaid on the image from the front of the headset, and right away the user has a visual indication of where they’re looking.
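A minimal sketch of the dark-pupil idea behind this kind of tracker: under IR illumination the pupil shows up as the darkest blob in the eye camera's frame, so thresholding and taking a centroid yields a usable pupil position. The threshold value and helper name here are illustrative, not from [Diako]'s actual code:

```python
import numpy as np

def find_pupil_center(gray, thresh=60):
    """Estimate the pupil center in an IR eye image by taking the
    centroid of all sufficiently dark pixels. (Illustrative helper;
    real pipelines usually add blob filtering and ellipse fitting.)"""
    ys, xs = np.nonzero(gray <= thresh)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 200x200 eye frame: bright sclera, dark pupil at (70, 120).
frame = np.full((200, 200), 200, dtype=np.uint8)
yy, xx = np.ogrid[:200, :200]
frame[(xx - 70) ** 2 + (yy - 120) ** 2 <= 15 ** 2] = 10

cx, cy = find_pupil_center(frame)
print(round(cx), round(cy))  # → 70 120
```

Overlaying that (x, y) estimate on the front camera's frame is then just a matter of drawing a marker after calibrating between the two views.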
Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and in extraordinarily high-tech heads-up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads-up displays possible.
Continue reading “Eye Tracking With The Oculus Rift”
The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques in measuring the absolute position of the human eye. Cornell students [Michael and John] took on an interesting approach for their final project and designed a phototransistor based eye-tracking system.
We can definitely see the potential of this project, but for their first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. With an end product in mind, the system consists of a pair of custom 3D-printed glasses and a wireless receiver, thus avoiding the need to be tethered to the computer under control. The horizontal position of the mouse pointer is controlled via the infrared eye-tracking mechanism, consisting of an infrared LED positioned above the eye and two phototransistors, one located on each side of the eye. The measured analog data from the phototransistors determines the eye’s horizontal position. The vertical movement of the mouse pointer is controlled with the help of a 3-axis gyroscope mounted to the glasses.

The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, because similar projects we’ve seen have been camera based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see if the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
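The differential reading from the two phototransistors maps naturally onto a horizontal cursor position: as the eye turns, one sensor sees more reflected IR than the other, and normalizing by the total makes the estimate robust to overall brightness. A sketch along those lines, with names and scaling that are our own rather than from the students' firmware:

```python
def eye_horizontal(adc_left, adc_right, span=400):
    """Map two phototransistor ADC readings to a horizontal cursor
    offset in pixels. `span` is an assumed full-scale deflection.
    Positive means the eye (and cursor) moves right."""
    total = adc_left + adc_right
    if total == 0:
        return 0.0  # no IR reflection detected at all
    return span * (adc_right - adc_left) / total

print(eye_horizontal(512, 512))  # → 0.0 (looking straight ahead)
print(eye_horizontal(300, 500))  # → 100.0 (deflected right)
```

The gyroscope would supply the vertical axis in the same fashion, by integrating pitch rate into a cursor offset.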
Continue reading “PhotoTransistor Based Eye-Tracking”
Eye tracking is a really cool technology used in dozens of fields ranging from linguistics to human-computer interaction to marketing. With a proper eye tracking setup, it’s possible for a web developer to see if their changes to the layout are effective, to measure how fast someone reads a page of text, and even to diagnose medical disorders. Eye tracking setups haven’t been cheap, though, at least until now. Pupil is a serious, research-quality eye tracking headset designed by [Moritz] and [William] for their thesis at MIT.
The basic idea behind Pupil is to put one digital camera facing the user’s eye while another camera looks out on the world. After calibrating the included software, the headset looks at the user’s pupil to determine where they’re actually looking.
The hardware isn’t specialized at all – just a pair of $20 USB webcams, an LED, an infrared filter made from exposed 35mm film negatives, and a 3D-printed headset conveniently for sale at Shapeways.
The software for Pupil is based on OpenCV and OpenGL and is available for Mac and Linux. Calibration is easy, as seen in the videos after the break, and the results are amazing for an eye tracking headset thrown together for under $100.
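The heart of any dual-camera headset like this is the calibration step that maps pupil coordinates (from the eye camera) to gaze coordinates (in the world camera's frame). A common approach, sketched here with assumed names and synthetic numbers (Pupil's actual OpenCV-based calibration may differ), is to fit a low-order polynomial over a grid of calibration targets:

```python
import numpy as np

def _basis(pupil_xy):
    """Second-order polynomial basis in the pupil coordinates."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack(
        [np.ones_like(px), px, py, px * py, px ** 2, py ** 2]
    )

def fit_gaze_map(pupil_xy, screen_xy):
    """Least-squares fit mapping pupil coords -> gaze coords."""
    coef, *_ = np.linalg.lstsq(_basis(pupil_xy), screen_xy, rcond=None)
    return coef

def apply_gaze_map(coef, pupil_xy):
    return _basis(pupil_xy) @ coef

# Nine-point calibration: pupil positions recorded while the user
# fixated known targets (synthetic data, affine ground truth).
pupil = np.array([[x, y] for x in (10.0, 20.0, 30.0) for y in (5.0, 15.0, 25.0)])
screen = np.column_stack([3 * pupil[:, 0] + 40, 2 * pupil[:, 1] + 10])

coef = fit_gaze_map(pupil, screen)
pred = apply_gaze_map(coef, np.array([[25.0, 10.0]]))
print(np.round(pred, 1))  # close to [[115.  30.]]
```

After this one-time fit, every frame's pupil position is pushed through the polynomial to get a gaze point to draw on the world view.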
Continue reading “Build an eye tracking headset for $90″
This is a great hack, and it’s an advertisement. We wish this were the norm when it comes to advertising because they’ve really got our number. Skittles enlisted a few engineers to build a web interface that moves robot-powered candies.
When we started looking into this we figured that a few robots were covered with over-sized cases that looked like Skittles. But that’s not it at all. What you see above is actually upside down. The top side of the white surface has one tiny wheeled robot for each candy. A magnet was embedded in each Skittle which holds it to the underside of the surface. The user interface was rolled out on a Facebook page. It uses a common webcam for eye tracking. When you move your eyes, the robot controlling your assigned candy moves in that direction. See for yourself in the clip after the break.
So we say bravo Mars Inc. We love it that you decided to show off what’s behind the curtain. As with the Hyundai pixel wall, there’s a whole subset of people who might ignore the ad, but will spend a lot of time to find out how it was done.
Continue reading “Webcam eye-tracking moves robot-powered Skittles candy”
[Bruce Land], professor at Cornell, is a frequent submitter to our tip line. Usually he sends in a few links every semester from undergraduate electronics courses. Now the fall semester is finally over and it’s time to move on to the more ambitious master’s projects.
First up is a head-mounted eye tracker. [Anil Ram Viswanathan] and [Zelan Xiao] put together a lightweight and low-cost eye tracking project that will record where the user is looking.
The eye tracker hardware is made of two cameras mounted on a helmet. The first camera faces forward, looking at the same thing the user is. The second camera is directed towards the user’s eye. A series of algorithms detect the iris of the user’s eye and overlays the expected gaze position on the output of the first camera. Here’s the design report. PDF, natch.
Next up is a face tracking project implemented on an FPGA. This project started out as a software implementation of a face tracking algorithm in MATLAB. [Thu-Thao Nguyen] translated this MATLAB code to Verilog and eventually got her hardware running on an FPGA dev board. Another design report.
Having a face detection and tracking system running on an FPGA is extremely interesting; the FPGA makes face tracking a very low power and hopefully lower-cost solution, allowing it to be used in portable and consumer devices.
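The design report has the real algorithm, but a representative FPGA-friendly face tracker, sketched here in Python with purely illustrative thresholds, classifies each pixel by color as it streams in and keeps only running sums for the centroid. Needing no frame buffer or multipliers is exactly what makes this kind of scheme translate well from MATLAB to Verilog:

```python
import numpy as np

def track_face(rgb):
    """Centroid of skin-colored pixels. The per-pixel comparisons and
    running-sum centroid are the streaming-hardware-friendly part;
    the threshold values here are illustrative only."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None  # no face-colored region in frame
    return int(xs.mean()), int(ys.mean())

# Synthetic frame: blue background with a skin-toned rectangle.
img = np.zeros((120, 200, 3), dtype=np.uint8)
img[..., 2] = 200                      # blue background
img[40:60, 100:140] = (180, 120, 90)   # "face" patch
print(track_face(img))  # → (119, 49)
```

In hardware, the mask test becomes a handful of comparators and the centroid two accumulators plus a divide at end-of-frame.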
You can check out the videos for these projects after the break.
Continue reading “Two computer vision builds from Cornell”
[John] has been working on a video-based eye tracking solution using OpenCV, and we’re loving the progress. [John]’s pupil tracking software can tell anyone exactly where you’re looking and allows for free head movement.
The basic idea behind this build is simple; when looking straight ahead, a pupil is perfectly circular. When an eye looks off to one side, the pupil looks more and more like an ellipse to a screen-mounted video camera. By measuring the dimensions of this ellipse, [John]’s software can make a very good guess at where the eye is looking. If you want the extremely technical breakdown, here’s an ACM paper going over the technique.
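The underlying geometry is easy to check: a circular pupil viewed off-axis foreshortens along one axis by the cosine of the viewing angle, so the fitted ellipse's axis ratio recovers that angle directly. A toy calculation of that relationship (not [John]'s code, which fits the ellipse from video first):

```python
import math

def gaze_angle_deg(major_axis, minor_axis):
    """Angle between the camera's optical axis and the gaze direction,
    from the foreshortening of the pupil: minor/major = cos(theta).
    Clamped to guard against noisy fits where minor > major."""
    ratio = min(minor_axis / major_axis, 1.0)
    return math.degrees(math.acos(ratio))

print(gaze_angle_deg(10.0, 10.0))  # → 0.0 (circle: looking at the camera)
print(gaze_angle_deg(10.0, 5.0))   # → 60.0 (strongly averted gaze)
```

The ellipse's orientation then disambiguates *which* direction the eye has turned, giving a full 2D gaze estimate.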
Like the EyeWriter project this build was based on, [John]’s build uses IR LEDs around the edge of a monitor to increase the contrast between the pupil and the iris.
After the break are two videos showing the eye tracker in action. Watching [John]’s project at work is a little creepy, but the good news is that a proper eye tracking setup doesn’t require the user to stare at their eye.
Continue reading “OpenCV knows where you’re looking with eye tracking”
[Luis Cruz] is a Honduran high school student, and he built an amazing electrooculography system, and the writeup (PDF warning) of the project is one of the best we’ve seen.
[Luis] goes through the theory of the electrooculogram – the human eye is polarized from front to back because of a negative charge in the nerve endings in the retina. Because of this minute difference in charge, a user’s gaze can be tracked by electrodes attached to the skin around the eye. After connecting eye electrodes to opamps and a microcontroller, [Luis] imported the data with a Python script and wrote an “eyeboard” application to enable text input using only eye movement. The original goal of the project was to build an interface for severely disabled people, but [Luis] sees applications for sleep research and gathering marketing data.
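At its simplest, turning the amplified EOG signal into input events is a matter of thresholding the electrode voltage around a dead band, since looking left and looking right swing the signal in opposite directions. A sketch with an assumed sign convention and threshold (not [Luis]’s firmware):

```python
def classify_eog(sample_uv, dead_band_uv=150):
    """Turn one horizontal-channel EOG sample (microvolts) into a gaze
    event. Sign convention assumed here: eye-right drives the signal
    positive, eye-left negative; the dead band rejects drift and noise."""
    if sample_uv > dead_band_uv:
        return "right"
    if sample_uv < -dead_band_uv:
        return "left"
    return "center"

# A short scan of samples becomes a sequence of selection events that
# an "eyeboard"-style application could step through:
events = [classify_eog(s) for s in (-300, 20, 400, -10)]
print(events)  # → ['left', 'center', 'right', 'center']
```

A real system would also need drift compensation, since electrode offsets wander over time, but the event logic stays this simple.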
We covered [Luis]’ homebrew 8-bit console last year, and he’s now controlling his Pong clone with his eye-tracking device. We’re reminded of a similar system developed by Atari, but [Luis]’ system uses a method that won’t give the user a headache after 15 minutes.
Check out [Luis] going through the capabilities of his interface after the break. Continue reading “Tracking eye movement by measuring electrons in the eye”