OpenCV Knows Where You’re Looking With Eye Tracking

[John] has been working on a video-based eye tracking solution using OpenCV, and we’re loving the progress. [John]’s pupil tracking software can tell anyone exactly where you’re looking and allows for free head movement.

The basic idea behind this build is simple: when looking straight ahead, a pupil appears perfectly circular. As the eye looks off to one side, the pupil looks more and more like an ellipse to a screen-mounted video camera. By measuring the dimensions of this ellipse, [John]’s software can make a very good guess at where the eye is looking. If you want the extremely technical breakdown, here’s an ACM paper covering the technique.
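The foreshortening trick is easy to sketch: fit an ellipse to the pupil contour (OpenCV’s `cv2.fitEllipse` will do this on a thresholded eye image), and the ratio of the minor to the major axis gives the off-axis angle. The function below is a minimal, hypothetical illustration of that geometry, not [John]’s actual code:

```python
import math

def gaze_angle_from_ellipse(major_axis, minor_axis):
    """Estimate how far off-axis the eye is pointed, in degrees.

    A circular pupil viewed at angle theta foreshortens into an
    ellipse whose minor/major axis ratio equals cos(theta).
    """
    ratio = max(0.0, min(1.0, minor_axis / major_axis))
    return math.degrees(math.acos(ratio))

print(gaze_angle_from_ellipse(40.0, 40.0))  # circular pupil: 0.0 degrees off-axis
print(gaze_angle_from_ellipse(40.0, 20.0))  # half as wide: ~60 degrees off-axis
```

The axis ratio only gives the magnitude of the deflection; in practice the ellipse’s rotation angle supplies the direction, and the two together pin down the gaze vector.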

Like the EyeWriter project this build was based on, [John]’s build uses IR LEDs around the edge of a monitor to increase the contrast between the pupil and the iris.

After the break are two videos showing the eye tracker in action. Watching [John]’s project at work is a little creepy, but the good news is that a proper eye tracking setup doesn’t require the user to stare at their own eye.

Continue reading “OpenCV Knows Where You’re Looking With Eye Tracking”

KanEye Tracking System Preview

[vimeo=3952932]

[Tempt One] is a graffiti artist with Lou Gehrig’s disease. He can no longer physically produce art, since the disease has taken away his ability to control his arms. His friends won’t let that be the end of it, though: they’re building a visual tracking system to let him work by moving his eyes. It seems like it would be very difficult to get any kind of smooth curve out of eye movement, but the short demonstration video, which you can see after the break, does a decent job, at least for something this early in development. The source code isn’t released yet, but they do plan to release it. If you want to make your own, you can find some info in a past post of ours. We’re guessing they intend to use it with something along the lines of the laser tagging system.

Continue reading “KanEye Tracking System Preview”

Hackaday Prize 2023: Eye-Tracking Wheelchair Interface Is A Big Help

For those with quadriplegia, electric wheelchairs with joystick controls aren’t much help. Typically, sip/puff controllers or eye-tracking solutions are used, but commercial versions can be expensive. [Dhruv Batra] has been experimenting with a DIY eye-tracking solution that can be readily integrated with conventional electric wheelchairs.

The system uses a regular webcam aimed at the user’s face. A Python script uses OpenCV and a homebrewed image segmentation algorithm to analyze the user’s eye position. The system is configured to stop the wheelchair when the user looks forward or up. Looking down commands the chair forward. Glancing left and right steers the chair in the given direction.
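A gaze-to-command mapping like the one described could look something like the sketch below, where the pupil’s position in the eye image (normalized to 0..1, with y increasing downward) is bucketed into drive commands. The thresholds are invented for illustration and are not taken from [Dhruv]’s code:

```python
def gaze_to_command(x, y):
    """Map a normalized pupil position to a wheelchair drive command.

    x, y are in [0, 1] with (0, 0) at the top-left of the eye image,
    so a large y means the pupil sits low, i.e. the user looks down.
    """
    if x < 0.35:
        return "LEFT"
    if x > 0.65:
        return "RIGHT"
    if y > 0.65:
        return "FORWARD"   # looking down drives the chair forward
    return "STOP"          # looking ahead or up halts the chair
```

A dead zone like the 0.35–0.65 band here is important in any real version, so that normal small eye movements don’t send the chair wandering.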

The Python script then sends the requisite commands over a TCP connection to an ESP32, which drives a set of servos that move the wheelchair’s joystick. This allows the device to be retrofitted to a wheelchair without any invasive modification.
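The PC-to-ESP32 link could be as simple as newline-terminated ASCII commands over a TCP socket. Here’s a sketch of that idea — the framing is our own guess, not the project’s actual protocol — demonstrated with a local socket pair standing in for the ESP32:

```python
import socket

def send_command(sock, command):
    """Write one newline-terminated ASCII drive command to an open socket."""
    sock.sendall(command.encode("ascii") + b"\n")

# Demo: a connected local socket pair stands in for the PC <-> ESP32 link.
pc_side, esp_side = socket.socketpair()
send_command(pc_side, "FORWARD")
print(esp_side.recv(16))  # b'FORWARD\n'
pc_side.close()
esp_side.close()
```

On the microcontroller end, reading until the newline gives a clean command boundary even if TCP delivers the bytes in odd chunks.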

It’s a neat idea, though it could likely benefit from some further development. A reverse feature would be particularly important, after all. However, it’s a great project that has likely taught [Dhruv] many important lessons about human-machine interfaces, particularly those beyond the ones we use every day. 

This project has a good lineage as well: a similar project, Eyedrivomatic, won the Hackaday Prize back in 2015.

This Eyeball Watches You Thanks To Kinect Tracking

Eyeballs are often watching us, but they’re usually embedded in the skull of another human or animal. When they’re staring at you by themselves, they can be altogether more creepy. This Halloween project from [allpartscombined] aims to elicit that exact spooky vibe.

The project relies on a Kinect V2 to do body tracking. It feeds data to a Unity app that figures out how to aim the eyeball at any humans detected in the scene. The app sends angle data to an Arduino over serial, with the microcontroller generating the necessary signals to command servos which move the eyeball.
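The angle math in the middle of that pipeline is straightforward: given the tracked person’s offset from the eyeball, two `atan2` calls produce the pan and tilt to send to the servos. A minimal sketch of that geometry (our own, with an assumed x-right, y-up, z-forward frame, not the project’s actual Unity code):

```python
import math

def aim_angles(dx, dy, dz):
    """Pan and tilt, in degrees, to point the eye at an offset (dx, dy, dz).

    Frame assumption: x to the eye's right, y up, z straight ahead.
    """
    pan = math.degrees(math.atan2(dx, dz))
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))
    return pan, tilt

print(aim_angles(0.0, 0.0, 2.0))  # target dead ahead: (0.0, 0.0)
print(aim_angles(2.0, 0.0, 2.0))  # target 45 degrees to the right
```

Using `atan2` rather than a plain ratio keeps the angles well-behaved when the target moves off to the side or behind the sensor.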

With tilt and pan servos fitted and the precision tracking from the Kinect data, the eye can be aimed at people in two dimensions. It’s significantly spookier than simply panning the eye back and forth.

The build was actually created by modifying an earlier project, an airsoft turret, something we’ve seen a few times around these parts. Fundamentally, the tracking part is the same; it’s just that in this case the eye doesn’t shoot at people… yet! Video after the break.

Continue reading “This Eyeball Watches You Thanks To Kinect Tracking”

Eye-Tracking Device Is A Tiny Movie Theatre For Jumping Spiders

The eyes are windows into the mind, and this research into what jumping spiders look at and why required a clever eye-tracking device built just for them. The eyesight of these fascinating creatures has a lot in common with ours: we both perceive a wide-angle region of lower visual fidelity, but can direct our attention to areas of interest within it to see greater detail. Researchers have been able to perform eye tracking on jumping spiders, literally showing exactly where they are looking in real time, with the help of a custom device that works a little like a miniature movie theatre.

A harmless temporary adhesive on top (and a foam ball for a perch) holds a spider in front of a micro movie projector and IR camera. Spiders were not harmed in the research.

To do this, researchers had to get clever. The unblinking lenses of a spider’s two front-facing primary eyes do not move. Instead, to look at different things, the cone-shaped inside of the eye is shifted around by muscles. This effectively pulls the retina around to point towards different areas of interest. Spiders, whose primary eyes have boomerang-shaped retinas, have an X-shaped region of higher-resolution vision that the spider directs as needed.

So how does the spider eye tracker work? The spider perches on a tiny foam ball and is attached, with the help of a harmless and temporary beeswax-based adhesive, to a small bristle. In this way, the spider is held stable in front of a video screen without otherwise being restrained. The spider is shown home movies while an IR camera picks up the reflection of IR light off the retinas inside the spider’s two primary eyes. By superimposing the IR reflection onto the displayed video, it becomes possible to literally see exactly where the spider is looking at any given moment. This is similar in some ways to how eye tracking is done for humans, which also uses IR, but watches the position of the pupil.
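In principle, tracking the reflection comes down to finding the bright retinal glints in each IR frame and drawing them over the corresponding video frame. As a toy illustration of that first step (our own simplification, not the researchers’ actual pipeline), here’s a brightest-pixel search over a grayscale frame represented as a list of rows:

```python
def brightest_spot(frame):
    """Return (x, y) of the brightest pixel in a grayscale IR frame.

    `frame` is a list of rows of intensity values; retinal eyeshine
    shows up as a compact bright blob against a dim background.
    """
    best_xy, best_val = (0, 0), -1
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > best_val:
                best_val, best_xy = value, (x, y)
    return best_xy

frame = [
    [10, 12, 11, 10],
    [11, 10, 90, 12],   # bright retinal reflection at (2, 1)
    [10, 11, 12, 10],
]
print(brightest_spot(frame))  # (2, 1)
```

A real implementation would threshold and centroid the blob rather than pick a single pixel, and would track both retinas, but the overlay idea is the same.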

In the short video embedded below, if you look closely you can see the two retinas make an X-shape of a faintly lighter color than the rest of the background. Watch the spider find and focus on the silhouette of a tasty cricket, but when a dark oval appears and grows larger (as it would look if it were getting closer) the spider’s gaze quickly snaps over to the potential threat.

Feel a need to know more about jumping spiders? This eye-tracking research was featured as part of a larger Science News article highlighting the deep sensory spectrum these fascinating creatures inhabit, most of which is completely inaccessible to humans.

Continue reading “Eye-Tracking Device Is A Tiny Movie Theatre For Jumping Spiders”

PhotoTransistor Based Eye-Tracking

The applications of eye-tracking devices are endless, which is why we always get excited to see new techniques in measuring the absolute position of the human eye. Cornell students [Michael and John] took on an interesting approach for their final project and designed a phototransistor based eye-tracking system.

We can definitely see the potential of this project, but for this first prototype, the system relies on both eye tracking and head movement to fully control a mouse pointer. With an end product in mind, the system consists of both a pair of custom 3D-printed glasses and a wireless receiver, avoiding the need to be tethered to the computer under control. The horizontal position of the mouse pointer is controlled by an infrared eye-tracking mechanism: an infrared LED positioned above the eye and two phototransistors located on either side of it. The analog readings from the phototransistors determine the eye’s horizontal position. Vertical movement of the mouse pointer is handled by a 3-axis gyroscope mounted to the glasses.

The effectiveness of a simple infrared LED/phototransistor pair at detecting eye movement is impressive, because similar projects we’ve seen have been camera based. We understand how final project deadlines can be, so we hope [Michael and John] continue past the deadline with this one. It would be great to see if the absolute position (horizontal and vertical) of the eye can be tracked entirely with the phototransistor technique.
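The differential measurement at the heart of the horizontal tracking can be sketched in a few lines: as the eye turns, the dark iris moves toward one phototransistor, so the difference between the two readings, normalized by their sum, tracks horizontal position. This sketch is illustrative only; real ADC values (and the sign convention) would need calibration against the actual sensor geometry:

```python
def horizontal_gaze(left_adc, right_adc):
    """Estimate horizontal eye position from two phototransistor readings.

    Returns a value in roughly [-1.0, 1.0]; 0.0 means the reflected IR,
    and hence the eye, is centered between the sensors. Which sign means
    "left" depends on the physical layout and must be calibrated.
    """
    total = left_adc + right_adc
    if total == 0:
        return 0.0          # no signal at all: assume centered
    return (right_adc - left_adc) / total

print(horizontal_gaze(512, 512))  # balanced readings: 0.0
print(horizontal_gaze(300, 700))  # skewed toward one sensor: 0.4
```

Normalizing by the sum makes the estimate tolerant of overall brightness changes, which matters when ambient IR varies.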

Continue reading “PhotoTransistor Based Eye-Tracking”