Low-Cost Eye Tracking with Webcams and Open-Source Software

“What are you looking at?” Said the wrong way, those can be fighting words. But in fields as diverse as psychological research and user experience testing, knowing what people are looking at in real time can be invaluable. Eye-tracking software does this, but generally at a cost that keeps it out of the hands of the home gamer.

Or it used to. With hacked $20 webcams, this open source eye tracker will let you watch how someone is processing what they see. But [John Evans]’ Hackaday Prize entry is more than that. Most of the detail is in the video below, a good chunk of which [John] uses to extol the virtues of the camera he uses for his eye tracker, a Logitech C270. And rightly so — the cheap and easily sourced camera has remarkable macro capabilities right out of the box, a key feature for a camera that’s going to be trained on an eyeball a few millimeters away. Still, [John] provides STL files for mounts that snap to the torn-down camera PCB, in case other focal lengths are needed.

The meat of the project is his Jevons Camera Viewer, an app he wrote to control and view two cameras at once. Originally written for a pick-and-place machine, the software can be used to coordinate the views of two goggle-mounted cameras, one looking out and one focused on the user’s eye. Reflections from the camera LED are picked up and used to judge the angle of the eye, with an overlay applied to the other camera’s view to show where the user is looking. It seems quite accurate, and plenty fast to boot.
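For anyone curious what that LED-reflection trick looks like in code, here’s a minimal sketch of the idea: find the brightest glint in the eye camera’s frame and drop a marker on the scene camera’s frame. It’s just an OpenCV illustration; the camera indices and the crude linear mapping are assumptions, not anything lifted from the Jevons Camera Viewer itself.

```python
# Minimal glint-and-overlay sketch -- an illustration, not [John]'s actual app.
# Camera indices and the linear glint-to-scene mapping are assumptions.
import cv2

eye_cam = cv2.VideoCapture(0)    # assumed: camera aimed at the eye
scene_cam = cv2.VideoCapture(1)  # assumed: camera looking out at the world

def find_glint(gray):
    """Return the (x, y) of the brightest spot -- the LED's corneal reflection."""
    blurred = cv2.GaussianBlur(gray, (9, 9), 0)
    _, _, _, max_loc = cv2.minMaxLoc(blurred)
    return max_loc

def glint_to_scene(pt, eye_shape, scene_shape):
    """Placeholder linear mapping; a real tracker would use a calibration step."""
    x = pt[0] / eye_shape[1] * scene_shape[1]
    y = pt[1] / eye_shape[0] * scene_shape[0]
    return int(x), int(y)

while True:
    ok_e, eye = eye_cam.read()
    ok_s, scene = scene_cam.read()
    if not (ok_e and ok_s):
        break
    gray = cv2.cvtColor(eye, cv2.COLOR_BGR2GRAY)
    gaze = glint_to_scene(find_glint(gray), gray.shape, scene.shape)
    cv2.circle(scene, gaze, 15, (0, 255, 0), 2)   # overlay the gaze marker
    cv2.imshow("scene", scene)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
```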

We think this is a great project, like so many others in the first round of the 2018 Hackaday Prize. Can you think of an awesome project based on eye tracking? Here’s your chance to get going on the cheap.


Redirected Walking in VR done via Exploit of Eyeballs

[Anjul Patney] and [Qi Sun] demonstrated a fascinating new technique at NVIDIA’s GPU Technology Conference (GTC) for tricking a human into thinking a VR space is larger than it actually is. The way it works is this: when a person walks around in VR, they invariably make turns. During these turns, it’s possible to fool the person into thinking they have pivoted more or less than they have actually physically turned. With a way to manipulate perception of turns comes a way for software to gently manipulate a person’s perception of how large a virtual space is. Unlike other methods that rely on visual distortions, this method is undetectable by the viewer.

Saccadic movements

The software essentially exploits a quirk of how our eyes work. When a human’s eyes move around to look at different things, the eyeballs don’t physically glide smoothly from point to point. The eyes make frequent but unpredictable darting movements called saccades. There are a number of deeply interesting things about saccades, but the important one here is the fact that our eyes essentially go offline during saccadic movement. Our vision is perceived as a smooth and unbroken stream, but that’s a result of the brain stitching visual information into a cohesive whole, and filling in blanks without us being aware of it.

Part one of [Anjul] and [Qi]’s method is to manipulate perception of a virtual area relative to actual physical area by making a person’s pivots not a 1:1 match. In VR, it may appear one has turned more or less than one has in the real world, and in this way the software can guide the physical motion while making it appear in VR as though nothing is amiss. But by itself, this isn’t enough. To make the mismatches imperceptible, the system watches the eye for saccades and times its adjustments to occur only while they are underway. The brain ignores what happens during saccadic movement, stitches together the rest, and there you have it: a method to gently steer a human being so that a virtual space feels larger than the physical area available.
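To make the gating concrete, the per-frame logic can be thought of as a small check: estimate the eye’s angular velocity, and only inject extra rotation when that velocity says a saccade is in flight. The sketch below is purely illustrative; the threshold and gain values are assumptions, not numbers from [Anjul] and [Qi]’s work.

```python
# Illustrative saccade-gated redirection sketch -- not the researchers' code.
# The velocity threshold and per-frame gain are assumed values for illustration.
import math

SACCADE_THRESHOLD_DEG_PER_S = 180.0   # eye velocity above this ~= a saccade
MAX_GAIN_DEG_PER_FRAME = 0.15         # tiny extra scene rotation per frame

def angular_speed(prev_gaze, gaze, dt):
    """Angular distance between two gaze directions (unit vectors), in deg/s."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_gaze, gaze))))
    return math.degrees(math.acos(dot)) / dt

def redirection_step(prev_gaze, gaze, dt, desired_offset_deg):
    """Return how much to silently rotate the virtual scene this frame."""
    if angular_speed(prev_gaze, gaze, dt) < SACCADE_THRESHOLD_DEG_PER_S:
        return 0.0                      # eye is fixating: change nothing
    return max(-MAX_GAIN_DEG_PER_FRAME,
               min(MAX_GAIN_DEG_PER_FRAME, desired_offset_deg))
```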

Embedded below is a video demonstration and overview, which also covers other ways of manipulating the perception of space in VR and how this approach avoids their pitfalls.


IoT Doorman: Eye-Controlled Door for a Girl with Cerebral Palsy

Kyleigh has an eye-controlled computer on her wheelchair, but something as simple as her bedroom door was still beyond her reach… until now! [Bill Binko] recently filmed a demo of an automatic, IoT door opener built for the young girl with cerebral palsy. [Bill] is a co-founder of ATMakers, an organization that enables makers interested in assistive technologies to collaborate with users to improve quality of life.

Using her eye tracking tablet (PRC Device), Kyleigh has two new icons that make the relevant call to a website, pushing a simple command to either open or close her bedroom door. The device attached to the door uses an Adafruit M0 WiFi Feather board, a DC stepper motor and wheel, a UBEC buck converter, and a potentiometer.

Since other family members are also going to be opening and closing the door, there’s a potentiometer that measures the door position, so everything works properly the next time Kyleigh wishes to use the door. The installation also maintains a fairly inconspicuous profile for the assistance it gives — the ‘brain’ is enclosed in a small box on the door, with the only slightly larger motor mounted at the door’s base.
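As a rough illustration of how that potentiometer feedback might be used, here’s a simple position-seeking loop. The helper functions and position constants are hypothetical placeholders, not [Bill]’s actual firmware for the Feather.

```python
# Rough sketch of the door's position-seeking loop. read_pot() and
# drive_motor() are hypothetical stand-ins for the Feather's real ADC and
# motor-driver code, and the position constants are guesses.
OPEN_POSITION = 0.90     # normalized pot reading with the door fully open
CLOSED_POSITION = 0.10   # normalized pot reading with the door fully closed
TOLERANCE = 0.03

def move_door(target):
    """Run the motor until the potentiometer says the door has arrived."""
    while True:
        position = read_pot()                # hypothetical: 0.0 .. 1.0 from the pot
        error = target - position
        if abs(error) < TOLERANCE:
            drive_motor(0)                   # hypothetical: 0 stops the motor
            return
        drive_motor(1 if error > 0 else -1)  # +1 opens, -1 closes

def handle_command(command):
    """Called when the web service reports a new 'open' or 'close' request."""
    move_door(OPEN_POSITION if command == "open" else CLOSED_POSITION)
```

Because the pot reports the door’s true position rather than a step count, it doesn’t matter whether a human or the motor moved the door last.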

[Bill] has a few quibbles with the project: he wants to reduce the wait before the open/close process kicks in and to optimize the open/close speed. You have to check out the video below to see how well it really works. We’re also excited to see Kyleigh using her gaze control to talk to an Amazon Echo. [Bill] foresees a door control improvement that links it to Alexa. And how much did it cost to improve the quality of life for this young girl? $70.

We love seeing makers help people, and cannot wait to see what 2018 will bring! If you’re looking for more inspiration, don’t miss the eye-controlled wheelchair project called Eyedrivomatic, which won the 2015 Hackaday Prize. There are also the top Assistive Technology projects from the Hackaday Prize.


Hackaday Prize Entry: Real Life XEyes

There’s a lot of tech that goes into animatronics, cosplay, and costumes. For their Hackaday Prize entry, [Dasaki] and [Dylan] are taking the eyes in a costume or Halloween prop to the next level with animatronic eyes that look where the wearer of this crazy confabulation is looking. It’s XEyes in real life, and it promises to be a part of some very, very cool costumes.

The mechanics of this system are actually pretty simple — it’s just a few servos joined together to make a pair of robotic eyes move up and down and left to right. This entire mechanism is mounted on a frame, to which is attached a very small camera pointed directly at the user’s (real) eye. The software is where things get fun. It’s a basic eye-tracking setup, with IR light illuminating the pupil and a compute unit that calculates where the user is looking.
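If you’ve never rolled your own pupil tracker, the detection step can be surprisingly compact. Here’s a hedged sketch of the usual dark-pupil approach under IR illumination: threshold for the darkest blob and take its centroid. The threshold value is a guess that would need tuning, and this isn’t [Dasaki] and [Dylan]’s code.

```python
# Hedged sketch of dark-pupil detection under IR illumination; the threshold
# value is an assumption that needs tuning per camera and lighting.
import cv2

def pupil_center(eye_gray):
    """Return the (x, y) centroid of the pupil, or None if nothing is found."""
    # Under IR light the pupil shows up as the darkest region of the frame.
    _, mask = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)    # biggest dark blob wins
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
```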

For the software, [Dasaki] and [Dylan] have collected a bunch of links, but right now the best solutions are the OpenMV and the Eye of Horus project from last year’s Hackaday Prize. It’s a great project, and a really fun entry for the Automation portion of this year’s Hackaday Prize.

Hackaday Prize Entry: DIY Eye Tracking

Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head, has a camera pointing out, but also a camera pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell you exactly what you were looking at. It’s exactly what you need to tell if people are actually looking at ads.

For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you want if you want to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – for handicapped people who cannot control their surroundings.

There are really only a few things you need to build an eye tracking camera – a pair of cameras and a bit of software. The cameras are just webcams, with the IR filters removed and a few IR LEDs aimed at the eye so the eye-facing camera can see the pupil. The second camera is pointed directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
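That bit of tricky math usually boils down to a calibration step: have the user stare at a handful of known points, then fit a mapping from pupil coordinates to scene-camera coordinates. A generic least-squares version might look like the sketch below, which is an assumption about the approach rather than Makeroni Labs’ implementation.

```python
# Generic calibration sketch: fit a quadratic mapping from pupil coordinates
# to scene-camera coordinates by least squares over a few calibration points.
import numpy as np

def fit_gaze_map(pupil_pts, scene_pts):
    """Least-squares fit of pupil (x, y) -> scene (u, v)."""
    x, y = np.asarray(pupil_pts, dtype=float).T
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(scene_pts, dtype=float), rcond=None)
    return coeffs                          # shape (6, 2): one column per scene axis

def apply_gaze_map(coeffs, pupil_pt):
    """Map one pupil position through the fitted polynomial."""
    x, y = pupil_pt
    features = np.array([1.0, x, y, x * y, x**2, y**2])
    return features @ coeffs               # estimated (u, v) in the scene camera
```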

The electronics are rather interesting, with all the processing running on a VoCore. It’s Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.


Eye Tracking With The Oculus Rift


There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.
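Reading the pupil’s shape as well as its location is commonly done by fitting an ellipse to the thresholded pupil contour, since the pupil appears as an ellipse whenever the eye turns off-axis. The snippet below is a guess at that step, not [Diako]’s actual code.

```python
# Guess at the shape-and-location step: fit an ellipse to the pupil contour.
# The dark threshold is an assumed value.
import cv2

def pupil_ellipse(eye_gray, dark_threshold=40):
    """Return ((cx, cy), (major, minor), angle) for the pupil, or None."""
    _, mask = cv2.threshold(eye_gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if len(c) >= 5]   # fitEllipse needs 5+ points
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    return cv2.fitEllipse(pupil)   # center, axis lengths, and rotation angle
```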

To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking is overlaid with the image from the front of the headset, and right away the user has a visual indication of where they’re looking.

Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and extraordinarily high tech heads up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads up displays possible.


PhotoTransistor Based Eye-Tracking


The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques in measuring the absolute position of the human eye. Cornell students [Michael and John] took on an interesting approach for their final project and designed a phototransistor based eye-tracking system.

We can definitely see the potential of this project, but for their first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. With an end product in mind, the system consists of both a pair of custom 3D-printed glasses and a wireless receiver, thus avoiding the need to be tethered to the computer under control. The horizontal position of the mouse pointer is controlled via the infrared eye-tracking mechanism, consisting of an infrared LED positioned above the eye and two phototransistors located on each side of the eye. The measured analog data from the phototransistors determine the eye’s horizontal position. The vertical movement of the mouse pointer is controlled with the help of a 3-axis gyroscope mounted to the glasses. The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, because similar projects we’ve seen have been camera based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see if the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
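For a sense of how little math is involved, here’s a back-of-the-envelope sketch of turning the two phototransistor readings into a horizontal cursor position. The read functions and the gain value are hypothetical stand-ins, not [Michael and John]’s firmware.

```python
# Hypothetical sketch: read_left() and read_right() stand in for the ADC reads
# of the two phototransistors, each returning a value from 0.0 to 1.0.
SCREEN_WIDTH = 1920
GAIN = 2.0     # assumed: how far the cursor swings per unit of left/right imbalance

def cursor_x():
    left = read_left()
    right = read_right()
    total = left + right
    if total == 0:
        return SCREEN_WIDTH // 2            # no signal: park in the middle
    balance = (right - left) / total        # -1.0 (far left) .. +1.0 (far right)
    x = SCREEN_WIDTH / 2 * (1 + GAIN * balance)
    return int(max(0, min(SCREEN_WIDTH - 1, x)))
```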
