Digging Deep Into The Neato’s LIDAR Module

[Hash] is going to great lengths to learn about the parts used in his Neato XV-11 LIDAR (dead link; Internet Archive). We looked in on his work with the XV-11 platform recently, where he used the dust bin of the vacuum as a modular hardware housing. This hack is a hardware exploration aimed at figuring out how an equivalent open hardware version can be built.

The LIDAR module is made of two big chunks: the laser and optics assembly, and the sensor board seen above. [Hash] put it under the microscope for a better look at the line scan imager. The magnification helped him find the company name on the die; this particular part is manufactured by Panavision. He figured out the actual model by counting the bond wires and the pixels in between them to get a pretty good guess at the resolution. He’s pretty sure it’s a DLIS-2K, and he links to an app note and the datasheet in his post. The chip to the right of the sensor is a TI digital signal processor.

Putting it back together may prove difficult because it will be impossible to realign the optics exactly as they were, so the module will need to be recalibrated. [Hash] plans to investigate how the calibration routines work and will post anything he finds. Check out his description of the teardown in the video after the break.

25 thoughts on “Digging Deep Into The Neato’s LIDAR Module”

  1. Darn, this means it is not a true LIDAR, but just a CMOS imager. If the laser is offset from the imager, then depending on how far away the object is from the sensor, the reflected spot will move across the field of view of the sensor. It still works almost as well as LIDAR, but a lot cheaper. (Basically this uses the same technique as those 3D scanners we see on HAD using a webcam and an offset laser.)

    We should be able to build these for even less!
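
    A minimal sketch of that offset-laser triangulation in Python; the baseline, focal length, and function name are made-up illustrative values, not the Neato’s actual optics:

        # Hypothetical numbers for illustration only -- not measured from the Neato.
        BASELINE_M = 0.05   # laser-to-imager offset, meters (assumed)
        FOCAL_PX = 700.0    # lens focal length expressed in pixels (assumed)

        def range_from_spot(pixel_offset):
            """Distance from how far the laser spot lands from the optical axis.

            A near object pushes the reflected spot far across the imager;
            a distant object leaves it close to the axis.
            """
            if pixel_offset <= 0:
                raise ValueError("spot not found, or target at infinity")
            return BASELINE_M * FOCAL_PX / pixel_offset

        print(range_from_spot(17.5))  # spot 17.5 px off-axis -> 2.0 m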

    1. It’s still a “true” LIDAR, because a LIDAR is just “Light Detection And Ranging.” And now I’m just trolling…

      But I wonder, at that price point and accuracy, would it be simpler to use something like a Kinect? No need for a scanning laser (or any laser), you get 3D and RGB data (even mic input for sonar?), and open source and official drivers already exist. It would be cool if someone could list the pros and cons of both options.

      1. Medix,

        TOmato, toMATO… It reports the distance around 360 degrees. Doesn’t really matter if it uses time-of-flight, triangulation, or fairy dust. End result is measurements you can use for mapping.

      2. Pro for Kinect: already available, cheap, and hackable.
        Con: horrible resolution; it uses a similar parallax measurement, but with a cheap camera and a dot pattern.
        The Neato LIDAR can divide its measuring range over 2K (or 4K with subsampling) pixels, giving centimeter/millimeter range precision. You should be very happy to reliably distinguish less than 10 cm on a Kinect.
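
        Back-of-envelope numbers for that precision claim; the baseline and focal length below are guesses for illustration, not Neato specifications:

            # Depth resolution of a triangulating ranger, per pixel of spot movement.
            # All values are assumed, not measured from the Neato.
            baseline = 0.05    # meters between laser and imager (assumed)
            focal_px = 2000.0  # focal length in pixels, ballpark for a 2K line imager (assumed)
            z = 2.0            # target range, meters

            # From z = baseline * focal_px / offset, a one-pixel error in the spot
            # position shifts the range estimate by roughly z**2 / (baseline * focal_px).
            dz = z ** 2 / (baseline * focal_px)
            print(f"{dz * 1000:.0f} mm per pixel at {z} m")  # -> 40 mm; sub-pixel fits and closer targets do better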

  2. From what I have seen the Kinect is pretty cool, but it is not a microcontroller-friendly sensor. It uses a lot of power and basically needs a full-on laptop to make use of it. All that 3D and RGB data is nice but very processor-intensive if you intend to do anything with it in real time.

    On the other hand, a 360-degree LIDAR can output data that an Arduino or similar processor could use to navigate around obstacles. So I would say it depends on what you are trying to accomplish. I may be biased though!

    For the record I also have a Kinect, just haven’t taken it out of the box yet…
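
    To give a feel for how light that processing is, here is a toy loop over one 360-reading scan (one distance per degree, in millimeters); the function and thresholds are my own illustration, nothing Neato-specific about the format:

        # Toy obstacle avoidance over a 360-entry range scan; illustrative only.
        def pick_heading(scan_mm, clear_mm=300):
            """Return the degree heading with the most clearance, or None if boxed in."""
            best_deg, best_range = None, clear_mm
            for deg, dist in enumerate(scan_mm):
                if dist > best_range:
                    best_deg, best_range = deg, dist
            return best_deg

        scan = [1000] * 360
        scan[0:30] = [250] * 30    # something close, dead ahead
        print(pick_heading(scan))  # -> 30: steer toward the first clear bearing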

      1. Meatman,

        I agree, processing power is almost free. The TurtleBot that runs ROS uses a small netbook and a Kinect. But dig deeper and they basically take a single horizontal line of data and turn it into a single-line LIDAR so they can process the data.

        Takes a lot of design work to make use of all the Kinect data.
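
        Roughly what that one-row trick looks like in Python; the 640-pixel width and 57-degree horizontal FOV are ballpark Kinect figures I’m assuming, not calibrated values:

            import math

            # Collapse the middle row of a depth image into (bearing, range) pairs,
            # the way a planar LIDAR would report them. Sketch only.
            def depth_row_to_scan(depth_rows_m, width=640, hfov_deg=57.0):
                row = depth_rows_m[len(depth_rows_m) // 2]  # the single horizontal line
                focal_px = (width / 2) / math.tan(math.radians(hfov_deg / 2))
                scan = []
                for u, z in enumerate(row):
                    bearing = math.atan((u - width / 2) / focal_px)
                    scan.append((bearing, z / math.cos(bearing)))  # ray length, not axial depth
                return scan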

  3. You know… they already have an academic paper on their laser design from ICRA 2008: A Low-Cost Laser Distance Sensor. It’s a sub-$30 design that uses a registered laser and CMOS imager to triangulate distances.

    There’s no need to reverse engineer the sensor, unless of course you want to marvel at the design decisions involved in making a mass-market product, in which case carry on.

    1. Travis,

      That is an interesting paper, but a lot changed between that paper and their actual design and implementation. The more you dig, the more you realize that a lot of assumptions made in that paper were changed for a real-world design.

      Kind of the difference between reading a book in school, and actually doing something for real.

      1. Hash,

        Can you elaborate on the design and implementation changes that you’ve observed? Obviously, the fabrication is much lower cost (injection molded instead of 3D printed and machined metal), better tolerances, superior alignment and calibration, cheaper BOM, etc. Perhaps you could ping me on gmail (beambot) or at Hizook? I’m crazy-interested in this space.

        Anyway, I personally know some of the early Neato folks, including the individual who helped fit the SLAM algorithm on a microcontroller. Their robot is definitely cool, but I’m bummed that (1) it hasn’t been more successful and (2) they have yet to sell (to my knowledge) the stand-alone laser rangefinders. I guess they’re too busy trying to get market share for their primary product.

    2. Travis,

      I’m on the Trossen Robotics forum quite a bit and am very interested in SLAM as well. I figured Neato will probably never sell a stand-alone unit, so my motivation was to replicate their production design.

      Reading the $30 sensor paper led me to believe they used a standard CMOS imager with a horizontal resolution of about 700 pixels. I was amazed to find a much better sensor in use, one that probably costs MORE than their initial design ideas. If you’re interested in helping with an open source design let me know; it would be cool to see a kit for sale on the Internet for under $75…

      1. Hash,

        I know it has been a while since anyone has commented here, but I was looking around online and happened across this post. I was wondering if you have made any progress on this project.

        Thanks!

        Nathan
