Robotic Motion Sensing Using An Optical Mouse

We’ve had fun with the sensors in optical mice before, but [Mac Cody] wrote in to tell us about his legitimate application of the technology. First, he disassembled the mouse and bypassed the on-board controller. He then wired the sensor’s clock and data lines to a Harris RTXEB single-board computer, which is based around a Harris RTX2001A microcontroller that he programmed in Forth to talk to the Agilent optical mouse sensor. Documented code is provided in case you want to implement it in a different language. His future plans for the system are to roll it into some robot projects for dead reckoning navigation.
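Driving the sensor directly means bit-banging its two-wire synchronous serial protocol: clock out a register address, then clock in the byte the sensor answers with. The sketch below is a hypothetical Python illustration of the shape of that exchange, not [Mac Cody]’s Forth code; the `write_bit`/`read_bit` callbacks stand in for real GPIO wiggling, a simulated sensor makes it runnable anywhere, and the register addresses come from the ADNS-2610 datasheet (verify against your copy before wiring anything up).

```python
# Register addresses per the ADNS-2610 datasheet -- double-check your copy.
DELTA_Y = 0x02
DELTA_X = 0x03
SQUAL   = 0x04   # surface quality

def read_register(addr, write_bit, read_bit):
    """Clock out an address byte (MSB first, bit 7 = 0 for a read),
    then clock in the 8 data bits the sensor drives back."""
    for i in range(7, -1, -1):
        write_bit((addr >> i) & 1)
    value = 0
    for _ in range(8):
        value = (value << 1) | read_bit()
    return value

def to_signed(byte):
    """Delta_X/Delta_Y report motion as 8-bit two's complement."""
    return byte - 256 if byte & 0x80 else byte

class FakeSensor:
    """Stand-in for the real part so the protocol sketch can run anywhere."""
    def __init__(self, regs):
        self.regs, self.addr_bits, self.out_bits = regs, [], []
    def write_bit(self, bit):
        self.addr_bits.append(bit)
        if len(self.addr_bits) == 8:          # full address byte received
            addr = 0
            for b in self.addr_bits:
                addr = (addr << 1) | b
            val = self.regs.get(addr, 0)
            self.out_bits = [(val >> i) & 1 for i in range(7, -1, -1)]
            self.addr_bits = []
    def read_bit(self):
        return self.out_bits.pop(0)

sensor = FakeSensor({DELTA_X: 0xFE, SQUAL: 0x40})
dx = to_signed(read_register(DELTA_X, sensor.write_bit, sensor.read_bit))
print(dx)   # -2: two counts of motion in the negative x direction
```

The real part also requires hold times between the address and data phases (and between back-to-back reads), which this sketch elides.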

19 thoughts on “Robotic Motion Sensing Using An Optical Mouse”

  1. this is a lot of work, and for some reason i hate to see a lot of work done for no reason. first of all, the optical mouse in question had a ps/2 interface. the ps/2 interface is incredibly easy to use with any microcontroller, and the mouse data is standardized, so he ignored a readily-available interface with accurate motion data already encoded. now he’s using an expensive sbc and doesn’t even have position sensing algorithms worked out, which the onboard mouse controller handles just fine. in addition, he’s using the parallel port with software control, so it’s not going to be realtime sensing like the mouse’s image processor can do. so, for this hack: thumbs up on getting an interface to a tiny camera, thumbs down on getting a practical position encoder.

  2. cbm5, it was not that much work, actually, to access the mouse sensor directly. I ignored the ps/2 interface for a good reason: I didn’t want to implement code for an interface that gave limited functionality (x-motion and y-motion only). Controlling the ADNS-2610 directly allows access to the many functions on the device (device reset, power down, forced awake, surface quality, min/max pixel values, individual pixel values, and shutter rate). The hack presented in my photoessay is a proof-of-concept and general experimentation with my SBC, which is part of my robot (check out my website). No PC is used in this, so your complaints are invalid. I’m in the process of taking the hack to the next level and re-hosting the mouse sensor in a housing with a longer-focal-length lens. This will get the assembly farther away from the imaged surface (off the floor) than an optical mouse allows.

    alex cd, sorry about the quality of the photos. I’m using a BTC PC-380 webcam that is hard to focus. Donations are gladly accepted! :)

  3. I’m not sure that “owned” is the best way to describe what happened here. i brought up some points that i surely wasn’t alone in considering, and mac cody defended his hack by clarifying the “why” of it. really, what was overlooked in the original writeup was any explanation of why this was done and how it is better than the normal mouse interface for position encoding. i’m sure i’m not the only one who evaluates hacks based on whether the final result is worth the effort, and without the extra data provided by mac’s post, i would have put this hack in the “won’t be using this idea” file.

  4. cbm5, I, for one, certainly don’t feel like I “own” you. I was just making clarifications based upon your statements and preconceptions. In fact, I’m pretty much repeating the information that was provided in my original submission comments (plus an explanation of the advantage of the approach I took). Those comments didn’t get transcribed clearly in Eliot’s original post.

  5. A few years back I saw a very similar hack in a very clever context: a robotic hovercraft. The problem with a robotic hovercraft is that it has no wheels, so you can’t do dead reckoning with wheel encoders. So they mounted two optical mice inside the skirt so that the craft could keep track of its position and orientation.
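The two-mouse trick works because each sensor measures the rigid body’s motion at its own mounting point: the shared component of the two readings is translation, and the differential component is rotation. A minimal sketch of that integration step, under assumed (hypothetical) geometry: both mice sit on the craft’s lateral axis a known baseline apart, their x axes point forward, and the deltas are already converted to distance units.

```python
import math

def update_pose(pose, d_left, d_right, baseline):
    """One dead-reckoning step from two optical mice.

    pose is (x, y, theta) in the world frame; d_left and d_right are
    (dx, dy) displacements from the two mice. The averaged deltas give
    body-frame translation; the difference in forward deltas, divided by
    the baseline, gives the rotation for this step.
    """
    x, y, theta = pose
    dtheta = (d_right[0] - d_left[0]) / baseline   # differential x = spin
    fwd  = (d_left[0] + d_right[0]) / 2.0          # body-frame surge
    side = (d_left[1] + d_right[1]) / 2.0          # body-frame sway
    h = theta + dtheta / 2.0                       # mid-step heading
    x += fwd * math.cos(h) - side * math.sin(h)
    y += fwd * math.sin(h) + side * math.cos(h)
    return (x, y, theta + dtheta)

# Straight ahead: both mice agree, heading unchanged.
print(update_pose((0.0, 0.0, 0.0), (3.0, 0.0), (3.0, 0.0), 2.0))
# Spinning in place: mice disagree, position unchanged.
print(update_pose((0.0, 0.0, 0.0), (-1.0, 0.0), (1.0, 0.0), 2.0))
```

As with any dead reckoning, the error accumulates step over step; the two-mouse version just adds orientation to what a single mouse can already give you.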

  6. So what kind of resolution do you get with the imaging sensor? Would it be possible to use it as a low-resolution camera, or do the optics make it unusable for targets less than a couple inches from the sensor?

    I’ve considered using an optical mouse as an encoder before, but I could never solve the mechanical problem of keeping it right against the floor without snagging, dragging, etc. How would you make this work on various floor types (i.e., carpet, tile, etc.)?

  7. heathkit, the ADNS-2610 has an 18×18-pixel imaging array, so its usefulness as a low-resolution camera would be limited. The ADNS-3060 has a 30×30-pixel array, so it may be more suitable for that purpose. The work of Trutna and Schumacher, found at http://www.contrib.andrew.cmu.edu/~ttrutna/16-264/Vision_Project/, shows how they used that mouse sensor to image their surroundings.

    I think that a mouse sensor could be used as a proximity warning device by using its optical flow capabilities to detect motion within its field of view. Polling the x-delta and y-delta registers would indicate that something “significant” was moving. This would be a low-bandwidth operation. If needed, the robot would then check the mouse sensor image, perhaps comparing it to a static scene image. This might serve as a clue to the robot as to what actions to take (turn on a webcam, power up motors to run, etc). I call it a “roach eye”. A roach usually doesn’t react unless it perceives a threat that is big enough to do it harm.

    The lens system of a typical optical mouse has a very short focal length, hence its required close proximity to the viewing surface. The new optical sensor I’m working on has a lens system that places the lens about 40 millimeters from the floor. This will solve the snagging and dragging issues you refer to. At least it will for the garage floor where my robot runs!
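The “roach eye” idea needs nothing more than a cheap polling loop over the motion registers. A minimal sketch, where the hypothetical `read_dx`/`read_dy` callbacks stand in for actual Delta_X/Delta_Y register reads (the threshold and poll count are made-up tuning values):

```python
def roach_eye(read_dx, read_dy, threshold=3, polls=10):
    """Accumulate optical-flow magnitude over a burst of polls; report True
    when enough motion crosses the field of view to be worth reacting to."""
    total = 0
    for _ in range(polls):
        total += abs(read_dx()) + abs(read_dy())
    return total >= threshold

# Something creeping through the field of view trips the threshold:
moving = iter([0, 0, 2, 1, 0, 0, 0, 0, 0, 0])
still  = iter([0] * 10)
print(roach_eye(lambda: next(moving), lambda: 0))   # True
print(roach_eye(lambda: next(still), lambda: 0))    # False
```

Only after this low-bandwidth check fires would the robot spend the cycles to read out the full pixel array and compare against a static scene.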

  8. hey!
    i am a novice! i would like to make a robot using an optical mouse sensor, but the adns 2610 has a short focal length. how can i increase the focal length when using this sensor?
    thank you
