Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head, with one camera pointing out and another camera pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell exactly what you’re looking at. It’s exactly what you need to tell if people are actually looking at ads.
For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you want if you want to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – for people with disabilities who cannot otherwise control their surroundings.
There are really only a few things you need to build an eye tracking system – a pair of cameras and a bit of software. The cameras are just webcams, with the IR filters removed and a few IR LEDs aimed at the eye so the eye-facing camera can see the pupil. The second camera points directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
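The eye-facing half of that pipeline boils down to finding the dark pupil in each IR-lit frame. Here’s a minimal sketch of that idea using plain NumPy on a synthetic frame – the function name and threshold are illustrative, not from the Makeroni Labs project, and a real build would use something like OpenCV blob or ellipse fitting:

```python
import numpy as np

def find_pupil_center(frame, threshold=50):
    """Estimate the pupil center in a grayscale eye image.

    Under IR illumination the pupil shows up as the darkest blob
    (the "dark pupil" technique), so a crude threshold-and-centroid
    pass is often enough for a first prototype.
    """
    mask = frame < threshold      # dark pixels = pupil candidates
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None               # no pupil visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic test frame: bright iris/sclera with a dark "pupil" disc
# centered at (80, 60).
frame = np.full((120, 160), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:120, 0:160]
frame[(xx - 80) ** 2 + (yy - 60) ** 2 < 15 ** 2] = 10

print(find_pupil_center(frame))  # → (80.0, 60.0)
```

Real frames add glints, eyelashes, and motion blur, which is why production trackers fit an ellipse to the pupil boundary instead of taking a raw centroid – but the centroid version runs happily on hardware as small as a VoCore.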
The electronics are rather interesting, with all the processing running on a VoCore. It’s Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.
Eye tracking is trivial (all you need is IR and you have a nice black area to detect) and it’s only a small part of the job. I’ve developed this kind of tracking and have used Tobii eye tracking. Both are great if you really, really need to use them (for disabled users), but eye tracking in general is a bad idea; eyes are not meant to be used for pointing at things and they get tired really fast. I’ve experienced it with my own eyes :)
When you’re using a headset-style tracker you also need to know where the head is and where it is pointing, and then this method gets really difficult. It has been done many times before, but it’s still not trouble free.
Sometimes it feels like things are developed technology-first (3D printing!!! wow! small computers!! wow) rather than focusing on the usage problem.
Uhh, not quite following how your eyes “tire out” from looking at things all day. That’s sort of what eyes, well, do all day?
Your eyes don’t move as much as you might think they do; you move your head too, so there isn’t much extreme pupil movement. When you use your eyes like a mouse, it’s very different from the norm. Even in VR apps, people only look around a few tens of degrees from center.
Small computers are useful for when you need a small computer in a small space.
3D printers are useful for creating prototypes and objects from 3D models.
Eye tracking is useful for, among other things, HMI for disabled people and advertising research.
What exactly are you trying to say?
Why are you jaded, jilted, and unwilling to face reality, yet perfectly willing to lash out at pursuits you do not understand?
Eye tracking is NOT trivial. It requires a complex apparatus, and to work well it must account for human-machine issues of comfort as well as technological issues of changing environments, size, and cost – and, not least of all, the actual computer vision that tracks the pupil and the attendant math to unscramble it into something useful.
You have absolutely no idea what you are talking about, do you?
I recall reading about this technology back in the mid ’70s, in what I think was a Popular Mechanics ‘What’s New’ column. A racing driver wore a helmet fitted with a macro TV camera that watched the reflection of the track and surroundings on his eyeball, along with where the pupil was, allowing researchers to see exactly what he was focusing on. I thought it rather ingenious at the time.
slight difference:
eye tracking takes an image and identifies the location of the eyes.
gaze tracking takes the image of an eye and calculates where it’s looking.
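The second half of that distinction is a calibration problem: after the pupil has been located in the eye camera’s frame, you still need a mapping into the scene camera’s coordinates. A minimal sketch of one common approach – fitting an affine transform from a few fixation targets via least squares; the calibration points and function names here are hypothetical, and real trackers typically use higher-order polynomial fits:

```python
import numpy as np

def fit_gaze_map(pupil_pts, scene_pts):
    """Fit an affine map from pupil coordinates (eye camera) to
    gaze coordinates (scene camera) from calibration samples.

    During calibration the user fixates known targets; a
    least-squares fit recovers the affine transform.
    """
    P = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])  # homogeneous coords
    A, *_ = np.linalg.lstsq(P, scene_pts, rcond=None)
    return A  # shape (3, 2)

def map_gaze(A, pupil_xy):
    """Apply the fitted transform to one pupil position."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ A

# Hypothetical calibration data: pupil positions recorded while the
# user looked at four known targets on a 640x480 scene image.
pupil = np.array([[10, 10], [50, 10], [10, 40], [50, 40]], float)
scene = np.array([[0, 0], [640, 0], [0, 480], [640, 480]], float)

A = fit_gaze_map(pupil, scene)
print(map_gaze(A, (30, 25)))  # mid-range pupil position maps near (320, 240)
```

An affine map can’t capture the curvature of the eyeball or headset slip, which is part of why head pose matters so much for headset-style trackers, as noted above.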
The VoCore’s neat – nice to see people using it. Hopefully this will help improve it.