Controlling a rover with your eyes

Controlling a robot simply by looking at your desired location is pretty freaking awesome. A web camera pointed at your face analyzes your head movements and pupil direction to send the bot signals. Look at a location and the bot goes there; change your expression to send other commands. This easily surpasses the laser-guided assistance droid for ease of use.
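The control scheme described above (look toward a screen edge to steer, look at the center to stop) can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code; the function name, the region thresholds, and the command strings are all assumptions:

```python
def gaze_to_command(x, y, width, height, margin=0.2):
    """Map a gaze point (x, y) on a width x height screen to a
    drive command. Gazing near an edge steers toward that edge;
    gazing at the center stops the rover. (Illustrative only.)"""
    if x < width * margin:           # left edge of the screen
        return "LEFT"
    if x > width * (1 - margin):     # right edge
        return "RIGHT"
    if y < height * margin:          # top edge
        return "FORWARD"
    if y > height * (1 - margin):    # bottom edge
        return "REVERSE"
    return "STOP"                    # center region
```

A real system would also smooth the gaze samples over time, since raw eye-tracker output jitters far too much to feed straight into motor commands.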

[via BotJunkie]


  1. macegr says:

    This is not a good system to have in operation near any attractive women.

  2. Bryan says:

I see no webcam… did I miss something? The point is that the man sitting in front of the monitor looks at an edge of the monitor to steer the drone, but I see no kind of camera watching him. What did I miss?

  3. macegr says:

Bryan: they were using an off-the-shelf eye tracker; it’s that bar mounted below the monitor. You can see spots from the IR emitters in this video. They say their *next* version will use a webcam to track the eyes.

  4. ribblem says:

    I wouldn’t say that this “easily surpasses the laser guided assistance droid for ease of use.” I think both are probably pretty easy to use and both have use cases where they excel.

  5. Dan says:

If you look closely at the screen he is looking at, you can see that it is controlled by a mouse. Therefore it is not controlled by his eyes.

  6. Dan says:

Actually, I guess it could be his eyes controlling the mouse.

  7. omikun says:

They really leveraged existing technology to do it all in just 2 hours. I wonder what they used the Lego parts for?

  8. CC says:

There would appear to be some sort of sensor array at the bottom of the screen. The webcam is used on the “robot” itself.

  9. bob says:

    Damn it, I’ve been meaning to do this for ages. Seems a lot more practical than shining lasers into one’s pupils.

  10. Dennis says:

They are not using a webcam; they are using special eye-tracker hardware. Possibly one made by Tobii.

  11. martin says:

    the eye tracker comes from SensoMotoric Instruments, see

    more information on the project can be found here:

    and gaze interaction in general,
