DIY Eye Tracking For VR Headsets, From A To Z

Eye tracking is a useful feature in social virtual reality (VR) spaces because it really enhances presence and communication when one’s avatar has a realistic gaze. Most headsets lack this feature, but EyeTrackVR has a completely open source solution ready for anyone willing to put it together.


EyeTrackVR is a combination of hardware, software, and 3D printable mounts for attaching a pair of microcontroller boards, cameras, and IR LEDs to just about any existing VR headset out there. An ESP32-based board and a tiny camera module watch each eyeball, and under IR illumination the pupil presents as an easily identified round black area. Software takes care of turning the camera’s view of the pupil into a gaze direction value that can be plugged into other software.
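
For a flavor of how simple the underlying image processing can be, here’s a minimal OpenCV sketch of the dark-pupil idea. It isn’t EyeTrackVR’s actual pipeline, and the camera index and threshold value are assumptions you’d tune for your own setup:

```python
import cv2

def pupil_to_gaze(frame, thresh=40):
    """Find the dark pupil blob and return a gaze offset in the range -1..1.

    Under IR illumination the pupil shows up as the darkest round region,
    so a simple threshold plus largest-contour search is often enough.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Pixels darker than `thresh` become white in the mask (the pupil).
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    (x, y), _radius = cv2.minEnclosingCircle(pupil)
    h, w = gray.shape
    # Offset from frame center, normalized so the extremes are roughly +/-1.
    return (x - w / 2) / (w / 2), (y - h / 2) / (h / 2)

# Example: read the eye camera (device index 0 is a guess) and print gaze.
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gaze = pupil_to_gaze(frame)
    if gaze is not None:
        print("gaze x=%.2f y=%.2f" % gaze)
```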

The project is still under active development, but in its current state it’s perfectly suitable for creating a functional system that can integrate into a variety of existing headsets with printed mounting brackets. Interested? Check out the intro and if it sounds up your alley, dive into the build guide which spells out everything you need to know. Check out the video below for a demo of EyeTrackVR working in VRChat, along with an overview of software support.

We’ve seen headsets built to custom specs that integrate eye tracking, but even if one is repackaging an existing headset that’s a perfect opportunity to include this feature.


Hackaday Prize 2023: Eye Tracking On A Budget

There is a lot to be learned from the experience of building something functional, and even better if doing so doesn’t break the bank. [Sergej Stoetzer]’s 20€ DIY-Eyetracker aims to be an educational process that covers everything from hardware to functional software in an accessible way.

The hardware is based on an economical USB endoscope, which can be used as-is or repackaged with IR illumination.

The eye tracker is based on an economical USB endoscope, which is a small camera optimized for up-close applications. By attaching the camera to a pair of common safety glasses so that it looks at one’s eye, some OpenCV and Python code can do simple tracking and interfacing with other projects.

Basic eye tracking — like determining whether a user is looking up, down, left, or right — can be all that’s needed depending on one’s application. That means that it’s possible to get something working with very little hardware and some easy-to-use OpenCV functions.
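
To give a sense of how little code that takes, here’s a hedged sketch that buckets a detected pupil center into one of five coarse directions. The dead-zone size and the pupil-detection step feeding it are assumptions, not [Sergej]’s actual code:

```python
def gaze_direction(pupil_x, pupil_y, frame_w, frame_h, dead_zone=0.2):
    """Classify a pupil center (in pixels) into a coarse gaze direction.

    `dead_zone` is the fraction of the frame around the center that still
    counts as "center"; tune it for your camera placement. Note that a
    camera looking back at the eye mirrors left and right.
    """
    # Normalize to -1..1 with (0, 0) at the middle of the frame.
    nx = (pupil_x - frame_w / 2) / (frame_w / 2)
    ny = (pupil_y - frame_h / 2) / (frame_h / 2)
    if abs(nx) < dead_zone and abs(ny) < dead_zone:
        return "center"
    # Whichever axis has the bigger excursion wins.
    if abs(nx) >= abs(ny):
        return "right" if nx > 0 else "left"
    return "down" if ny > 0 else "up"

print(gaze_direction(500, 240, 640, 480))  # -> "right"
```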

Even better performance can be had by adding IR illumination and repackaging the camera into a 3D printed enclosure. The pupil of the eye is an aperture in the iris that appears as a black circle, and that’s even more true under IR illumination, which is invisible to the naked eye. If you’re curious about what’s inside those USB endoscope cameras and how to remove their IR filter, there are some good pictures of that process in this project.

The ability to get something prototyped quickly and working well enough to learn new things is a valuable skill, and that’s why Re-engineering Education is one of the challenges in the 2023 Hackaday Prize.

Low-Cost Eye Tracking With Webcams And Open-Source Software

“What are you looking at?” Said the wrong way, those can be fighting words. But in fields as diverse as psychological research and user experience testing, knowing what people are looking at in real-time can be invaluable. Eye-tracking software does this, but generally at a cost that keeps it out of the hands of the home gamer.

Or it used to. With hacked $20 webcams, this open source eye tracker will let you watch how someone is processing what they see. But [John Evans]’ Hackaday Prize entry is more than that. Most of the detail is in the video below, a good chunk of which [John] uses to extol the virtues of the camera he uses for his eye tracker, a Logitech C270. And rightly so — the cheap and easily sourced camera has remarkable macro capabilities right out of the box, a key feature for a camera that’s going to be trained on an eyeball a few millimeters away. Still, [John] provides STL files for mounts that snap to the torn-down camera PCB, in case other focal lengths are needed.

The meat of the project is his Jevons Camera Viewer, an app he wrote to control and view two cameras at once. Originally written for a pick and place machine, the software can be used to coordinate the views of two goggle-mounted cameras, one looking out and one focused on the user’s eye. Reflections from the camera LED are picked up and used to judge the angle of the eye, with an overlay applied to the other camera’s view to show where the user is looking. It seems quite accurate, and plenty fast to boot.
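
For the curious, the pupil-to-glint trick can be sketched in a few lines of OpenCV. This is a rough illustration of the general idea rather than the Jevons Camera Viewer’s code, and the threshold values are assumptions to tune for a specific camera:

```python
import cv2

def pupil_glint_vector(eye_frame):
    """Return the (dx, dy) vector from the LED glint to the pupil center.

    Because the glint stays nearly fixed on the cornea while the pupil moves,
    this vector tracks gaze more robustly than the pupil position alone.
    """
    gray = cv2.cvtColor(eye_frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)

    # The glint is the brightest spot in the frame (the LED's reflection).
    _, _, _, glint = cv2.minMaxLoc(gray)

    # The pupil is the largest very dark blob.
    _, dark = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (px, py), _ = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))

    return px - glint[0], py - glint[1]
```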

We think this is a great project, like so many others in the first round of the 2018 Hackaday Prize. Can you think of an awesome project based on eye tracking? Here’s your chance to get going on the cheap.


Eye Tracking Makes The Musical Eye Conductor For Everyone!

For his final project at the Copenhagen Institute of Interaction Design, [Andreas Refsgaard] decided to make something that matters: a system that allows anyone to control a musical instrument using only their eyes and facial expressions. Someone should enter this into a certain contest that’s running…

Dubbed the Eye Conductor, the system [Andreas] created is highly customizable, providing a control interface that can be operated using only your eyes and a few facial expressions. Designed with the intent of letting everyone enjoy playing music, it was user tested at schools, at housing communities for people with physical disabilities, and with anyone [Andreas] could find in a wheelchair. He intends to continue the project so that all people can enjoy playing music.

The system is open, designed for inclusion, and can be customized to fit the physical abilities of whoever is using it.


Hackaday Prize Entry: DIY Eye Tracking

Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head, has a camera pointing out, but also a camera pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell you exactly what you were looking at. It’s exactly what you need to tell if people are actually looking at ads.

For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you need if you want to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – to help people with disabilities who can’t otherwise control their surroundings.

There are really only a few things you need to build an eye tracking camera – a pair of cameras and a bit of software. The cameras are just webcams, with the IR filters removed and a few IR LEDs aimed at the eye so the eye-facing camera can see the pupil. The second camera is pointed directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
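
The “tricky math” usually boils down to a calibration-derived mapping: have the user fixate a handful of known points seen by the forward camera, then fit a function from pupil coordinates to scene-camera pixels. Here’s a hedged NumPy sketch of one common approach (a second-order polynomial fit), which may or may not be what Makeroni Labs ends up using:

```python
import numpy as np

def fit_gaze_mapping(pupil_pts, scene_pts):
    """Fit a 2nd-order polynomial mapping pupil (x, y) -> scene (x, y).

    `pupil_pts` and `scene_pts` are matched Nx2 arrays gathered during
    calibration, e.g. while the user stares at markers held in front of
    the scene camera. Needs at least 6 well-spread points.
    """
    p = np.asarray(pupil_pts, dtype=float)
    s = np.asarray(scene_pts, dtype=float)
    x, y = p[:, 0], p[:, 1]
    # Design matrix of polynomial terms: 1, x, y, xy, x^2, y^2
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)
    return coeffs  # shape (6, 2)

def map_gaze(coeffs, pupil_xy):
    """Apply the fitted mapping to a single pupil position."""
    x, y = pupil_xy
    terms = np.array([1.0, x, y, x * y, x**2, y**2])
    return terms @ coeffs  # scene-camera (x, y) in pixels
```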

The electronics are rather interesting, with all the processing running on a VoCore. It’s Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.


Eye Tracking With The Oculus Rift


There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.

To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking is overlaid with the image from the front of the headset, and right away the user has a visual indication of where they’re looking.
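
In spirit that overlay step is just two capture devices and a drawn marker. The sketch below shows the plumbing only; the device indices are guesses and estimate_gaze() is a hypothetical stand-in for whatever pupil-tracking routine you use, not [Diako]’s code:

```python
import cv2

def estimate_gaze(eye_frame):
    """Placeholder (hypothetical helper): plug in your pupil tracker here.

    It should return (x, y) in scene-camera pixels, for example a dark-pupil
    detector plus a calibration mapping. Returning None means "no fix yet",
    so this sketch just shows the raw scene view until you fill it in.
    """
    return None

eye_cam = cv2.VideoCapture(0)    # camera aimed at the eye (index is a guess)
scene_cam = cv2.VideoCapture(1)  # camera looking out the front of the headset

while True:
    ok_eye, eye_frame = eye_cam.read()
    ok_scene, scene_frame = scene_cam.read()
    if not (ok_eye and ok_scene):
        break
    gaze = estimate_gaze(eye_frame)
    if gaze is not None:
        # Draw a red ring on the scene view where the user is looking.
        cv2.circle(scene_frame, (int(gaze[0]), int(gaze[1])), 12, (0, 0, 255), 2)
    cv2.imshow("gaze overlay", scene_frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

eye_cam.release()
scene_cam.release()
cv2.destroyAllWindows()
```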

Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and extraordinarily high-tech heads-up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads-up displays possible.


Build An Eye Tracking Headset For $90


Eye tracking is a really cool technology used in dozens of fields, from linguistics and human-computer interaction to marketing. With a proper eye tracking setup, it’s possible for a web developer to see if their changes to the layout are effective, to measure how fast someone reads a page of text, and even to diagnose medical disorders. Eye tracking setups haven’t been cheap, though, at least until now. Pupil is a serious, research-quality eye tracking headset designed by [Moritz] and [William] for their thesis at MIT.

The basic idea behind Pupil is to put one digital camera facing the user’s eye while another camera looks out on the world. After calibrating the included software, the headset looks at the user’s pupil to determine where they’re actually looking.
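
Calibration here generally means showing the wearer a series of known targets and recording where the pupil sits for each one; those matched pairs then feed a mapping fit like the one sketched in the Makeroni Labs entry above. Below is a hedged sketch of the collection step only, with the target layout, window handling, and find_pupil() helper all assumptions rather than Pupil’s actual routine:

```python
import cv2
import numpy as np

def collect_calibration(eye_cam, find_pupil, screen_w=1280, screen_h=720):
    """Show 9 on-screen targets; sample the pupil center for each on a keypress.

    `find_pupil` is whatever routine returns the pupil (x, y) in the eye
    camera frame. Returns matched lists of pupil points and target points.
    """
    targets = [(int(screen_w * fx), int(screen_h * fy))
               for fy in (0.1, 0.5, 0.9) for fx in (0.1, 0.5, 0.9)]
    pupil_pts, target_pts = [], []
    for tx, ty in targets:
        canvas = np.zeros((screen_h, screen_w, 3), dtype=np.uint8)
        cv2.circle(canvas, (tx, ty), 10, (0, 255, 0), -1)
        cv2.imshow("calibration", canvas)
        # Let the wearer fixate the dot, then press any key to take a sample.
        cv2.waitKey(0)
        ok, frame = eye_cam.read()
        pupil = find_pupil(frame) if ok else None
        if pupil is not None:
            pupil_pts.append(pupil)
            target_pts.append((tx, ty))
    cv2.destroyWindow("calibration")
    return pupil_pts, target_pts
```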

The hardware isn’t specialized at all – just a pair of $20 USB webcams, an LED, an infrared filter made from exposed 35mm film negatives, and a 3D printed headset conveniently for sale at Shapeways.

The software for Pupil is based on OpenCV and OpenGL and is available for Mac and Linux. Calibration is easy, as seen in the videos after the break, and the results are amazing for an eye tracking headset thrown together for under $100.
