[Luis Cruz] is a Honduran high school student who built an amazing electrooculography system, and his writeup (PDF warning) of the project is one of the best we’ve seen.
[Luis] goes through the theory of the electrooculogram – the human eye is polarized from front to back, with the retina carrying a negative charge relative to the cornea. Because of this minute difference in charge, a user’s gaze can be tracked by electrodes attached to the skin around the eye. After connecting eye electrodes to op-amps and a microcontroller, [Luis] imported the data with a Python script and wrote an “eyeboard” application to enable text input using only eye movement. The original goal of the project was to build an interface for severely disabled people, but [Luis] sees applications for sleep research and gathering marketing data.
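To make the idea concrete, here’s a minimal sketch of how EOG readings might be turned into gaze directions. Everything here is illustrative and not from [Luis]’ writeup: it assumes a single horizontal channel sampled by the microcontroller’s ADC, a resting baseline estimated during calibration, and an invented threshold.

```python
# Illustrative sketch: classifying horizontal gaze from one EOG channel.
# Samples are assumed to be raw ADC readings; the threshold is invented.

def baseline_of(samples):
    """Estimate the resting potential as a simple mean of calibration samples."""
    return sum(samples) / len(samples)

def classify_gaze(sample, baseline, threshold=50):
    """Return 'left', 'right', or 'center' for one EOG sample."""
    delta = sample - baseline
    if delta > threshold:
        return "right"   # positive swing: cornea rotated toward this electrode
    if delta < -threshold:
        return "left"
    return "center"

calibration = [510, 512, 508, 511, 509]  # user stares straight ahead
base = baseline_of(calibration)
print(classify_gaze(600, base))  # large positive swing -> "right"
print(classify_gaze(430, base))  # large negative swing -> "left"
```

A real system (and presumably [Luis]’) also has to fight electrode drift and blinks, which show up as large spikes on the vertical channel.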
We covered [Luis]’ homebrew 8-bit console last year, and he’s now controlling his Pong clone with his eye-tracking device. We’re reminded of a similar system developed by Atari, but [Luis]’ system uses a method that won’t give the user a headache after 15 minutes.
Check out [Luis] going through the capabilities of his interface after the break. Continue reading “Tracking eye movement by measuring electrons in the eye”
The folks at Waterloo Labs have delivered quite an amusing project: a system to control Mario with eye movements. Unlike the other eye movement systems we’ve seen that use imaging to detect where you are looking, this one uses electrodes on muscles in your face. Not only do they supply a fairly amusing video, they also provide a good amount of detail on the project site. Be sure to click on the links in the “additional resources” section at the bottom if you want hardware and software details on the build. The last time we saw these folks, they were using real guns to control video games.
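The real build details are in Waterloo Labs’ additional resources, but the basic idea of turning a muscle-electrode signal into button presses can be sketched generically. This is not their code; the threshold and the rectify-then-edge-detect scheme are a common, simplified approach:

```python
# Generic sketch (not Waterloo Labs' implementation): converting a
# stream of muscle-electrode samples into discrete button events.
# The threshold value is invented for illustration.

def muscle_events(samples, threshold=200):
    """Emit one 'press' per excursion above threshold (rising edge only)."""
    events = []
    active = False
    for s in samples:
        level = abs(s)  # rectify: contraction polarity doesn't matter
        if level >= threshold and not active:
            events.append("press")
            active = True
        elif level < threshold:
            active = False  # re-arm once the muscle relaxes
    return events

stream = [10, 250, 260, 30, -220, 15]
print(muscle_events(stream))  # two contractions -> ['press', 'press']
```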
The EyeWriter is an open source eye tracking initiative. This is the mature version of the KanEye project we covered in April. Collaboratively developed by Free Art and Technology (FAT), OpenFrameworks, and the Graffiti Research Lab, the project aims to aid a friend of the developers who suffers from the degenerative muscle disease ALS.
They’ve come a long way since we last looked in on their progress. The hardware is pretty much the same: a set of sunglasses sans lenses with the CCD from a Sony PlayStation 3 Eye mounted in front of one eye. IR LEDs surround the CCD and point at the eye to increase the contrast between the pupil and the rest of the eye. The major improvement comes with the software. Eye tracking appears to be extremely precise, and they’ve written a custom drawing program to take advantage of their interface. Check in on their developer page for source code and a video walk-through of the software.
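The IR illumination trick matters because it makes the pupil the darkest region in the frame, which a tracker can exploe with a very simple algorithm: threshold the image and take the centroid of the dark pixels. The EyeWriter’s actual code (on their developer page) is far more robust; this toy sketch, with a made-up frame and threshold, just shows the core idea:

```python
# Toy sketch of dark-pupil tracking: under IR illumination the pupil
# is the darkest blob, so threshold and take the dark-pixel centroid.
# Frame data and threshold are invented; real trackers do much more.

def pupil_centroid(frame, dark_threshold=40):
    """frame: 2D list of grayscale values (0 = black). Returns (row, col) or None."""
    row_sum = col_sum = count = 0
    for r, row in enumerate(frame):
        for c, px in enumerate(row):
            if px < dark_threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None  # no pupil found in this frame
    return (row_sum / count, col_sum / count)

frame = [
    [200, 200, 200, 200],
    [200,  10,  20, 200],
    [200,  15,  12, 200],
    [200, 200, 200, 200],
]
print(pupil_centroid(frame))  # dark blob centered at (1.5, 1.5)
```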
After the break you can see video of [Tempt1] using the system to create some tags. We’re glad to see the success this project has had; [Tempt1] can do better with his eye than we can with our hands.
Continue reading “EyeWriter is the fruit of the KanEye project”
Generally, when tracking eye movement we use methods that require sensors pointed at the eye itself. This approach is quite different in that it senses the “electrical potential of the cornea”. We have no idea how this works, but it looks pretty cool.
[Tempt One] is a graffiti artist who has Lou Gehrig’s disease. He can no longer physically produce art, since his disease has taken his ability to control his arms. His friends won’t let that be the end of it, though. They’re building a visual tracking system to let him work by moving his eye. It seems like it would be very difficult to get any kind of a smooth curve out of eye movement, but the short demonstration video, which you can see after the break, does a decent job, at least for something this early in development. The source code isn’t released yet, but they do plan to release it. If you wanted to make your own, you could find some info in a past post of ours. We’re guessing they intend to use it with something along the lines of the laser tagging system.
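On the smooth-curve problem: one common way to tame jittery gaze samples into something drawable is a simple moving average over recent points. Since KanEye’s source isn’t out yet, this is a generic sketch rather than their method, with an arbitrary window size:

```python
# Generic sketch: smoothing jittery (x, y) gaze samples with a
# moving average so they trace a drawable curve. Window size is
# arbitrary; KanEye's actual filtering is unknown.

def smooth(points, window=3):
    """Return a moving average over a list of (x, y) gaze samples."""
    out = []
    for i in range(len(points)):
        chunk = points[max(0, i - window + 1): i + 1]
        avg_x = sum(p[0] for p in chunk) / len(chunk)
        avg_y = sum(p[1] for p in chunk) / len(chunk)
        out.append((avg_x, avg_y))
    return out

jittery = [(0, 0), (1, 3), (2, -1), (3, 2), (4, 1)]
print(smooth(jittery))  # same length, but the y-jitter is damped
```

A larger window gives a smoother line at the cost of lag, which is exactly the trade-off a drawing interface has to tune.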
Continue reading “KanEye tracking system preview”
Controlling a robot simply by looking at your desired location is pretty freaking awesome. A web camera pointed at your face analyzes your movements and pupil direction, then sends the bot signals. Look at a location and the bot goes there; change your expression to send other commands. This easily surpasses the laser guided assistance droid for ease of use.
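The control mapping could be as simple as dividing the camera frame into zones and issuing a drive command based on where the pupil sits. The zone boundaries and command names below are invented for illustration, not taken from this project:

```python
# Hedged sketch: mapping horizontal pupil position within the camera
# frame to a robot drive command. Zone bounds and command names are
# invented, not from the project described above.

def gaze_to_command(pupil_x, frame_width):
    """Map horizontal pupil position (pixels) to a drive command."""
    third = frame_width / 3
    if pupil_x < third:
        return "turn_left"
    if pupil_x > 2 * third:
        return "turn_right"
    return "forward"

print(gaze_to_command(50, 640))   # far left of frame -> turn_left
print(gaze_to_command(320, 640))  # center -> forward
print(gaze_to_command(600, 640))  # far right -> turn_right
```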