Steve Evans Passes Away, Leaves An Inspiring Legacy

It is with great sadness that Hackaday learns of the passing of Steve Evans. He was one of the creators of Eyedrivomatic, the eye-controlled wheelchair project which was awarded the Grand Prize during the 2015 Hackaday Prize.

News of Steve’s passing was shared by his teammate Cody Barnes in a project update on Monday. For more than a decade Steve had been living with Motor Neurone Disease (MND). He slowly lost the function of his body, but his mind remained intact throughout. We are inspired that despite his struggles he chose to spend his time creating a better world. Above you can see him test-driving an Eyedrivomatic prototype, the blue 3D printed attachment seen on the arm of his chair.

The Eyedrivomatic is a hardware adapter for electric wheelchairs that bridges the chair’s physical controls with the eye-controlled computers used by people living with ALS/MND and in many other situations. The project is Open Hardware and Open Source Software. The team continues to work on making Eyedrivomatic more widely available, refining the design for ease of fabrication, and has even begun selling kits so that those who cannot build one themselves still have access.

The team will continue with the Eyedrivomatic project. If you are inspired by Steve’s story, now is a great time to look into helping out. Contact Cody Barnes if you would like to contribute to the project. Love and appreciation for Steve and his family may be left as comments on the project log.

Hackaday Prize Entry: DIY Eye Tracking

Deep in the dark recesses of Internet advertisers and production studios, there’s a very, very strange device. It fits on a user’s head and has a camera pointing out at the world, plus a second camera pointing back at the user. That extra camera is aimed right at the eye. This is a gaze tracking system, a wearable robot that looks you in the eye and can tell you exactly what you were looking at. It’s exactly what you need to tell if people are actually looking at ads.

For their Hackaday Prize entry, Makeroni Labs is building an open source eye tracking system. It’s exactly what you need if you want to test how ‘sticky’ a webpage is, or – since we’d like to see people do something useful with their Hackaday Prize projects – for people with disabilities who cannot otherwise control their surroundings.

There are really only a few things you need to build an eye tracker – a pair of cameras and a bit of software. The cameras are just webcams with the IR filters removed, plus a few IR LEDs aimed at the eye so the eye-facing camera can see the pupil. The second camera points directly ahead, and with a bit of tricky math, the software can figure out where the user is looking.
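
That tricky math starts with simply finding the pupil in the eye camera’s frame. A minimal sketch of dark-pupil detection with OpenCV might look like this; the camera index, threshold value, and blob-size cutoff are all assumptions to be tuned for the real hardware and IR lighting:

```python
# A minimal sketch of dark-pupil detection with OpenCV. The camera index,
# threshold value, and blob-size cutoff are assumptions to tune for the
# actual webcam and IR lighting.
import cv2

cap = cv2.VideoCapture(0)  # the modified, eye-facing webcam (assumed index)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # Under off-axis IR illumination, the pupil is the darkest blob in view.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)
        if cv2.contourArea(pupil) > 100:  # reject noise specks
            (x, y), r = cv2.minEnclosingCircle(pupil)
            cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The pupil center that falls out of this loop is the raw input to the gaze math; mapping it into the world camera’s view is a calibration problem.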

The electronics are rather interesting, with all the processing running on a VoCore. It runs Linux, though, and hopefully it’ll be fast enough to capture two video streams, calculate where the pupil is looking, and send another video stream out. As far as the rest of the build goes, the team is 3D printing everything and plans to make the design available to everyone. That’s great for experimentation in gaze tracking, and an awesome technology for the people who really need it.
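
Whether a board that small can keep up is an empirical question, and a quick throughput test answers it. A rough sketch, assuming OpenCV runs on the target and the two webcams enumerate as devices 0 and 1; dropping the capture resolution, as done here, is usually the first lever to pull on constrained hardware:

```python
# Rough throughput check for the two-camera pipeline. The camera indices
# and the 320x240 resolution are assumptions.
import time
import cv2

eye = cv2.VideoCapture(0)
world = cv2.VideoCapture(1)
for cam in (eye, world):
    cam.set(cv2.CAP_PROP_FRAME_WIDTH, 320)
    cam.set(cv2.CAP_PROP_FRAME_HEIGHT, 240)

frames, start = 0, time.time()
while frames < 300:
    ok_eye, eye_frame = eye.read()
    ok_world, world_frame = world.read()
    if not (ok_eye and ok_world):
        break
    frames += 1

print("combined capture rate: %.1f fps" % (frames / (time.time() - start)))
```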

Eye Tracking With The Oculus Rift

There’s a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.

To get a window on the world, as it were, [Diako] slapped another camera onto the front of the Oculus. Both cameras feed into the same computer, the gaze estimate is overlaid on the image from the front of the headset, and right away the user has a visual indication of where they’re looking.
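
The overlay itself is the easy part once you have a pupil position and a calibrated mapping into the scene camera’s coordinates. A minimal sketch follows; the affine transform stands in for whatever calibration [Diako] actually performed, and its numbers are made up:

```python
# Draw the estimated gaze point on the forward-facing camera's frame.
# The affine matrix below is a placeholder for a real calibration result.
import numpy as np
import cv2

A = np.array([[2.1, 0.0, -150.0],
              [0.0, 2.3, -120.0]])  # assumed eye-to-scene affine mapping

def pupil_to_scene(px, py):
    """Map a pupil center (eye-camera pixels) to scene-camera pixels."""
    sx, sy = A @ np.array([px, py, 1.0])
    return int(sx), int(sy)

def draw_gaze(scene_frame, pupil_center):
    """Overlay a gaze marker on one frame from the front camera."""
    gx, gy = pupil_to_scene(*pupil_center)
    cv2.circle(scene_frame, (gx, gy), 12, (0, 0, 255), 2)
    cv2.drawMarker(scene_frame, (gx, gy), (0, 0, 255),
                   markerType=cv2.MARKER_CROSS, markerSize=20)
    return scene_frame
```

Called once per frame pair, this gives exactly the live “you are looking here” feedback described above.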

Yes, using a computer to know where you’re looking may seem like a rather useless build, but stuff like this is used in research and in extraordinarily high-tech heads-up displays. Although he’s not using the motion tracking on the Oculus, if [Diako] were to do so, he’d have the makings of one of the most powerful heads-up displays possible.

Build An Eye Tracking Headset For $90

Eye tracking is a really cool technology used in dozens of fields, ranging from linguistics and human-computer interaction to marketing. With a proper eye tracking setup, it’s possible for a web developer to see if changes to a layout are effective, to measure how fast someone reads a page of text, and even to diagnose medical disorders. Eye tracking setups haven’t been cheap, though, at least until now. Pupil is a serious, research-quality eye tracking headset designed by [Moritz] and [William] for their thesis at MIT.

The basic idea behind Pupil is to put one digital camera facing the user’s eye while another camera looks out on the world. After a calibration step, the included software watches the user’s pupil to determine where they’re actually looking.
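
Calibration of this sort usually means having the user fixate a handful of known targets while the pupil position is recorded, then fitting a mapping between the two point sets. Here is a sketch of that idea using an assumed second-order polynomial fit and made-up sample data; Pupil’s actual solver may differ:

```python
# Fit a polynomial mapping from pupil coordinates to scene coordinates
# using least squares. All sample points below are made up for illustration.
import numpy as np

def features(p):
    """Second-order polynomial terms of a pupil position."""
    x, y = p
    return [1.0, x, y, x * y, x * x, y * y]

# Pupil centers recorded while the user fixated each target.
pupil_pts = np.array([[310, 240], [420, 250], [200, 245],
                      [305, 150], [315, 330], [410, 160],
                      [205, 155], [415, 335], [195, 340]])
# Matching target positions in scene-camera pixels.
scene_pts = np.array([[320, 240], [560, 240], [80, 240],
                      [320, 60], [320, 420], [560, 60],
                      [80, 60], [560, 420], [80, 420]])

X = np.array([features(p) for p in pupil_pts])
coeffs, *_ = np.linalg.lstsq(X, scene_pts, rcond=None)

def gaze(pupil_xy):
    """Estimate the scene-camera gaze point for a new pupil position."""
    return np.array(features(pupil_xy)) @ coeffs

print(gaze((310, 240)))  # should land near (320, 240)
```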

The hardware isn’t specialized at all – just a pair of $20 USB webcams, an LED, an infrared filter made from exposed 35mm film negatives, and a 3D printed headset, conveniently for sale at Shapeways.

The software for Pupil is based on OpenCV and OpenGL and is available for Mac and Linux. Calibration is easy, as seen in the videos after the break, and the results are amazing for an eye tracking headset thrown together for under $100.
