KanEye Tracking System Preview

[vimeo=3952932]

[Tempt One] is a graffiti artist who has Lou Gehrig’s disease. He can no longer physically produce art, since the disease has taken away his ability to control his arms. His friends won’t let that be the end of it, though: they’re building a visual tracking system that lets him work by moving his eyes. It seems like it would be very difficult to get any kind of smooth curve out of eye movement, but the short demonstration video, which you can see after the break, does a decent job, at least for something this early in development. The source code hasn’t been released yet, but they plan to do so. If you want to make your own, you can find some info in a past post of ours. We’re guessing they intend to use it with something along the lines of the laser tagging system.

[vimeo=3928435]

23 thoughts on “KanEye Tracking System Preview”

  1. YEAH, that’s amazing. I can’t articulate how awesome this is.

    What a gift. Why did no one try this before??? I bet Stephen Hawking could improve his WPM output with something like this. Using something like the original Palm OS character input system, I bet someone could quickly reach maybe 20-30 WPM.

    I had thought of the idea before but automatically dismissed it as impractical (probably because I have free use of my arms). As long as you have visual feedback, there should be little problem using the system. I can see that now (sorry, it just came out that way).

  2. As I posted on their site:

    First off, I have to say it’s great to see people helping out friends like this, and I hope it all works out fantastically!

    Just one suggestion, though – rather than forcing one eye to stare at a camera, you could use a prism or something (like they use in aircraft HUDs) so that the camera is mounted facing down and the image it captures is reflected from the eye. This would allow your friend to literally look at where they want the effect to happen (through the prism) rather than estimating with the other eye. You might have problems resolving the eye, or with light levels, though, I guess. Just a thought.

    I meant one of these:

    http://en.wikipedia.org/wiki/Beam_splitter

  3. Eye tracking isn’t new, but it’s impressive to have implemented it this well. I’m not a big fan of graffiti, so I couldn’t care less if he couldn’t do it again, but the benefits of the hardware are obviously great.

    IIRC, the Apache’s main gun turret is vision-guided?

  4. This is actually a pretty old idea, with open source available (type “eyetracking opensource” into Google). The basis is: you have an infrared LED that creates a reflection point in your eye, invisible to you but visible to any webcam able to catch infrared (a minimal detection sketch follows at the end of this comment). I used a $20 MSI StarCam 370i two years ago.

    The reason no one is manufacturing this otherwise excellent thing is exactly what I wrote: an infrared LED hitting your eye. You cannot see it, but your brain does (the iris shrinks), and since it’s hitting only one eye, it causes a headache after only minutes. Plus, no one is sure what the long-term consequences for your eyes are.

    Big companies have non-contact eye tracking that operates on some other principle – I think that is a much more interesting area to explore.
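
    A minimal sketch of that detection step, assuming Python with OpenCV and a webcam whose IR filter has been removed; the glint is simply taken as the brightest blob in the frame, and the threshold of 240 is a guess to tune for your LED and camera:

    ```python
    import cv2

    cap = cv2.VideoCapture(0)  # webcam with the IR filter removed
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (5, 5), 0)
        # The IR glint shows up as the brightest spot in the image.
        _, mask = cv2.threshold(gray, 240, 255, cv2.THRESH_BINARY)
        m = cv2.moments(mask)
        if m["m00"] > 0:  # found some bright pixels: mark their centroid
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            cv2.circle(frame, (int(cx), int(cy)), 10, (0, 0, 255), 2)
        cv2.imshow("glint", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
    ```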

  5. Jeff: “I type with my nose, who needs this…”

    Jeff, I’m a software developer, and moving the mouse to the point where a click is needed eats literally 50% of my time. I wish there were an eye-tracking solution that moved the mouse pointer to wherever I’m currently looking (see the sketch after this comment). It would save me hours.
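
    For what it’s worth, once a tracker can produce calibrated screen coordinates, gluing them to the pointer is the easy part. A toy sketch, assuming the pyautogui library, with a stand-in get_gaze() in place of a real tracker:

    ```python
    import math
    import time

    import pyautogui  # cross-platform mouse control

    def get_gaze(t):
        # Stand-in for a real eye tracker: sweep a circle around the
        # screen center. Replace with calibrated gaze coordinates.
        w, h = pyautogui.size()
        return (w / 2 + 200 * math.cos(t), h / 2 + 200 * math.sin(t))

    start = time.time()
    while time.time() - start < 10:      # run for ten seconds
        x, y = get_gaze(time.time() - start)
        pyautogui.moveTo(x, y)           # snap the pointer to the gaze point
        time.sleep(1 / 30)               # ~30 Hz, a typical webcam frame rate
    ```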

  6. Dude, that’s inspirational regardless. I just lost feeling in my hand and am trying to get back to my guitar. When you love something you fight for it, and even if it’s hit and miss, the beauty is in the attempt.

  7. Hook this up with a laser and I could actually relive my childhood fantasies of sitting in a moving car and cutting a line through the countryside with my laser-beam eyes.

  8. “I bet Stephen Hawking could improve his WPM output with something like this”

    Actually, that is how he uses his speech program now. His ALS has gotten much, much worse, to the point where he can’t click his ‘input clicker’ anymore.

    It’s truly sad that some of the most talented people in the world are afflicted with debilitating conditions such as this.

  9. This is awesome. I remember when the original post for the openEyes project came up on Hack a Day – since then, I’ve been wondering if anyone else has done it. I seriously want to.

    As some here are claiming they have already done this, I have a couple of questions: is there any way to “underclock” the IR LED for minimal power, so that eye strain/long-term damage can be reduced? Perhaps a momentary switch on the LED as well, held in only while using the capability?

    Also, my idea uses eye tracking to control servos – how could this be implemented? Do you need a laptop to make use of this, or could a program running on something like a Gumstix computer do the trick?

    Finally, an idea: why not, instead of using IR, put in a custom contact lens with a bright white calibration ring (or a grid pattern) on it, and have the camera lock on and calibrate to that? That way, no IR is needed! Could that work?

    Excuse me if the questions are stupid – I can make anything, but programming is beyond me so far.

  10. drew:

    I don’t have experience in this area, but here are my ideas.

    The notion that we can’t see IR but the brain can is utter BS. If the brain could see it, we would be able to see it too. However, IR can still damage our eyes. An important question would be: how much can we reduce the power and still pick up the signal using cheap hardware (e.g., webcams)?

    Your idea about only flashing the LED occasionally is a good one. IIRC, LEDs have 2-4 ms response times. It should be possible to use pulse-width modulation to reduce the power output while still having a signal visible to a webcam operating at 30 Hz (a rough PWM sketch follows at the end of this comment).

    Eye tracking could probably be accomplished using embedded hardware (like a Gumstix computer) that could then interface with the servos.

    But one issue that hasn’t really been mentioned much is calibration. I’m pretty sure you would need to recalibrate each time you moved your head relative to the IR emitter and webcam (i.e., every time you adjusted the glasses or put them on). To do that, you would probably need to have the user look at known points on a fixed surface (a calibration sketch also follows below).

    Another issue I just thought of is that the shape of the eye affects the relationship between how it moves and what it’s focused on.

    Would colored or shiny contact lenses help? Sure, but contact lenses can move around on the eye; they don’t always stay fixed.
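
    On the PWM idea: a rough sketch, assuming a Raspberry Pi drives the IR LED through a current-limiting resistor on GPIO 18 (the pin, frequency, and duty cycle are placeholders to tune):

    ```python
    import RPi.GPIO as GPIO

    LED_PIN = 18  # placeholder: whichever GPIO drives the IR LED

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LED_PIN, GPIO.OUT)

    # 1 kHz is far above a ~30 Hz webcam frame rate, so each exposure
    # averages many PWM cycles and the camera sees a dim, steady glint.
    pwm = GPIO.PWM(LED_PIN, 1000)
    pwm.start(20)  # 20% duty cycle: a guess at "just bright enough"

    try:
        input("LED dimmed via PWM; press Enter to stop...")
    finally:
        pwm.stop()
        GPIO.cleanup()
    ```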
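
    And on calibration: one common approach (not necessarily what the KanEye project does) is to have the user fixate a few known points and fit a least-squares mapping from pupil coordinates to surface coordinates. A sketch using numpy, with made-up numbers for four recorded pupil/screen pairs:

    ```python
    import numpy as np

    # Pupil-center coordinates from the tracker, recorded while the user
    # fixated four known screen points (all numbers here are made up).
    pupil = np.array([[310, 240], [390, 238], [312, 300], [392, 302]], float)
    screen = np.array([[0, 0], [1280, 0], [0, 720], [1280, 720]], float)

    # Fit an affine map  screen = [px, py, 1] @ A  by least squares.
    design = np.hstack([pupil, np.ones((len(pupil), 1))])
    A, *_ = np.linalg.lstsq(design, screen, rcond=None)

    def gaze_to_screen(px, py):
        return np.array([px, py, 1.0]) @ A

    print(gaze_to_screen(350, 270))  # roughly the middle of the screen
    ```

    In practice a quadratic term often helps, precisely because of the eye-shape issue raised above, but the affine fit shows the idea.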

  11. @andrew

    LEDs have *much* faster response times than 2-4 ms. You can easily dim them using PWM; it’s very common and works well. You can also dim LEDs using a constant-current power source (lowering the output current to dim them), and LED driver chips usually work on that principle.

    Also, a general note: cheap webcams work great for seeing IR, but you need to disassemble them and remove the little glass filter over the lens. They will usually see some IR without doing that, but it’s *much*, much better if you remove the filter.

    I’ve used a cheap Microsoft camera I got at Fry’s for $20, and it works great without the filter.

    Also remember that the eye is normally exposed to loads of IR light from the sun, though the pupils constrict in bright light; with an invisible IR source they stay dilated, so this may be worse. Not sure.
    -Taylor

  12. @ Andrew & Taylor-

    Thanks, it’s nice to see I’m not nuts! In the case of the gentleman here, my idea to use a momentary switch to control the IR could be done with a simple mouth switch – a pressure pad he can bite down on, since his arms no longer work. Combined with a dimmed IR LED, this would also save his eyes from unnecessary damage.

    @ werd – thanks for that link. It’s nice to see it’s been done, but that setup seems far too complex for what I need. My main concern is how the servos would actually be linked to the openEyes software – as far as I understood, the original software didn’t support servo control in any way. I’m trying to figure out how the eye-tracking software could be linked to a servo-control interface – my guess is it’s a software issue beyond me. I need to make friends with some CMU students near me; maybe someone there could help. Software and even simple Arduino programming are beyond me!

    @ Andrew – yes, I see now that the contact-lens grid idea wouldn’t work, since the shape of the eye and movement of the lens would throw off the calibration.

    Final question I’ll throw out: can a laser point from a handheld pointer be homed in on by another laser mounted on a servo? I.e., I point with a laser pointer, and a servo-mounted laser with an outward-facing camera (a reverse eye tracker?) centers its own laser on that point? I’m designing something for aiming an airsoft gun.
    Maybe I’m nuts? I’ve seen programs that track the brightest point on screen, and I figure that could be narrowed down to tracking only a green laser point (a rough sketch follows this comment).
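
    On that last question: tracking a green laser dot is indeed just a narrower version of “track the brightest point.” A sketch with OpenCV that thresholds for bright, saturated green in HSV and turns the dot’s offset from frame center into pan/tilt error terms (the HSV bounds are guesses to tune for your laser and lighting, and the servo link itself, e.g. serial to an Arduino, is left out):

    ```python
    import cv2

    cap = cv2.VideoCapture(0)  # camera mounted alongside the servo laser
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        # Bright, saturated green; tune these bounds on site.
        mask = cv2.inRange(hsv, (45, 100, 200), (75, 255, 255))
        m = cv2.moments(mask)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            h, w = mask.shape
            # Offset from frame center: feed these to the pan/tilt servos
            # and drive until both errors go to zero.
            err_pan, err_tilt = cx - w / 2, cy - h / 2
            print(f"pan {err_pan:+.0f}px  tilt {err_tilt:+.0f}px")
        cv2.imshow("laser", mask)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()
    ```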
