Scanning On The Cheap

[Will] recently stumbled across the MakerBot Digitizer, a device that’s basically a webcam and a turntable that will turn a small object into a point cloud that can then be printed off on a MakerBot® 3D printer. Or any other 3D printer, for that matter. The MakerBot Digitizer costs $800, and [Will] wondered if he could construct a cheaper 3D scanner with stuff sitting around his house. It turns out he can get pretty close using only a computer, a webcam, and a Black & Decker line laser/level.

The build started off with a webcam mounted right next to the laser line level. Software consisted of Python using OpenCV, numpy, and matplotlib to grab images from the webcam. The software looks at each frame of video for the path of the laser shining against the object to be scanned. This line is then extracted into a 3D point cloud and reconstructed in MeshLab to produce a 3D object that might or might not be 3D printable.
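The per-frame laser extraction described above can be sketched in a few lines of numpy. This is an illustrative guess at the approach, not [Will]’s actual code; it assumes a red line laser and a simple brightness threshold:

```python
import numpy as np

def extract_laser_column(frame_bgr, threshold=200):
    """For each image row, find the column where the red laser line is
    brightest. Returns an array of (row, col) pixel coordinates.
    `frame_bgr` is the HxWx3 uint8 array cv2.VideoCapture returns."""
    red = frame_bgr[:, :, 2].astype(np.float32)  # OpenCV stores channels as BGR
    cols = np.argmax(red, axis=1)                # brightest column in each row
    rows = np.arange(red.shape[0])
    hit = red[rows, cols] > threshold            # keep rows where the laser is actually visible
    return np.column_stack([rows[hit], cols[hit]])
```

With the camera and laser at a known angle to each other, each detected column offset can then be triangulated into a depth value.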

This is only [Will]’s first attempt at creating a scanner. He’s not even using a turntable with this project – merely manually rotating the object one degree for 360 individual frames. It’s extremely tedious, and he’ll be working on incorporating a stepper motor in a future version.
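Stitching 360 one-degree profiles into a single cloud is just a rotation of each profile by the turntable angle. The function below is a hedged sketch of that step, with made-up names; `depths` stands in for the triangulated distance of each laser sample from the rotation axis:

```python
import numpy as np

def profile_to_points(depths, heights, angle_deg):
    """Rotate one scan profile into world coordinates.

    depths    - radial distance of each laser sample from the rotation axis
    heights   - vertical position of each sample
    angle_deg - how far the object had been turned when the frame was grabbed
    (Illustrative names, not [Will]'s actual code.)
    """
    a = np.radians(angle_deg)
    depths = np.asarray(depths, dtype=float)
    x = depths * np.cos(a)
    y = depths * np.sin(a)
    return np.column_stack([x, y, heights])

# One profile per degree of rotation merges into a single cloud:
# cloud = np.vstack([profile_to_points(d, h, i) for i, (d, h) in enumerate(frames)])
```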

This is only attempt number 1, but already [Will] has a passable scanned object created from a real-world thing.

20 thoughts on “Scanning On The Cheap”

    1. (I was going to mention DAVID, but since you already did, I’ll make it a reply.)
      Nowadays, the “big thing” in 3D scanning is “structured light”. DAVID supports it, and you can even get open-source implementations. I’ve been wanting to make dedicated hardware that does it and just sends a 3D model to the PC/printer.
      There are two basic forms, and one of them can be done in real-time, if you have a fast enough projector and camera (either can be nearly real-time, even with normal, consumer-grade hardware).
      I don’t mean to take away from the laser/turntable setup. It’s a much cheaper method, and it can obviously produce decent results. 123D Catch is rubbish, though :P

      1. Ever noticed how the models scanned using 123D have the texture images stuck on the model? Nobody ever shows what the untextured model looks like.

        Another thing that bugs me are items posted on thingiverse whereby the image is a photo of the original item, and then you look at the scanned model, only to find that the scan is horrible! Why post a beautiful picture when the model is so bad?

        1. I’ve looked at raw models from Catch. Without the textures, they’re barely recognizable! (and they’re almost always huge, because they capture a bunch of useless “scenery” from around the model, which nobody bothers to chop off). “123D Caught” models should be banned from Thingiverse (and every other online repository) until Autodesk works out the bugs.

          1. Personally, I disagree; 123D Catch is a really great tool for creating basic meshes when done properly (i.e. not with a cellphone camera). Yes, the meshes do need cleaning up, but it’s a very good starting point.
            http://katzmattcreates.tumblr.com/post/45257485074/katzmatt-start-to-finish-casey-style-ps here is one of my good scans from raw, to digital, to mesh to print
            http://katzmattcreates.tumblr.com/post/45179512802/here-kids-let-me-tell-you-a-little-story-about-3d
            and here is an example of a good and a bad scan, differing only in background

  1. You can improve the laser line by getting a narrow slit laser cut in a thin piece of metal. Figure out a way to mount that over the laser level’s beam spreader.

    What you do not want is too narrow of a slit, which will cause diffraction of the beam.

    Shining a laser through two linear polarization filters causes some neat effects that change depending on the rotation of the filters relative to each other.

    What doesn’t do a thing to a laser other than slightly dimming it are the lenses in RealD 3D movie glasses. Those use right and left circularly polarized filters. What I found extra odd is that after passing through a left lens, which should have altered the polarization to match, the polarized beam would also pass right through a right lens on a second pair of the glasses. I was hoping for nifty diffraction patterns but got nothing.

    Use linear polarizers and white light instead: after passing through one, the light will not pass through a second filter oriented at 90 degrees to the first. That’s one of the things that make LCD panels work.

  2. Structured light scanning is one of the older, more venerable techniques.

    Fairly sure the Stanford bunny and David were done using a Cartesian one of these.

    To increase the information per capture, try adding a green and a red laser to the projection, and run a batch image filter on the images’ RGB channels prior to extracting features. If you add a reference plane behind the object in the x and y orientations, you can also move the source (such as rotating it), and two stepper motors become a very thorough scanning platform.

    Another version of this uses a projector and a sine-wave RGB pattern, then wavelet math to reconstruct depth.
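Separating two laser colours captured in one frame, as the comment above suggests, can be done by subtracting the other channels so white highlights (which register in every channel) cancel out. A hypothetical helper, not from the post:

```python
import numpy as np

def split_laser_channels(frame_bgr):
    """Separate red and green laser lines captured in a single frame.
    Subtracting the other channels suppresses white highlights that would
    otherwise show up in both extracted images."""
    f = frame_bgr.astype(np.int16)  # avoid uint8 underflow during subtraction
    red_only   = np.clip(f[:, :, 2] - f[:, :, 1] - f[:, :, 0], 0, 255).astype(np.uint8)
    green_only = np.clip(f[:, :, 1] - f[:, :, 2] - f[:, :, 0], 0, 255).astype(np.uint8)
    return red_only, green_only
```

Each returned image can then be fed through the same line-extraction step used for a single laser.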

  3. The more expensive scanners have the user move the scanning head, but they’re on the very expensive CMM arms. But a frame that moves a head up and down might do a lot of good. The Matter & Form machines do that on a linear track next to a turn table.

    The next thing after that is cleaning up the point cloud of spurious data.
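Cleaning spurious points can be as simple as a statistical outlier filter: drop any point whose neighbours are unusually far away. A brute-force numpy sketch (O(n²) pairwise distances, fine for small clouds; the function name and thresholds are made up for illustration):

```python
import numpy as np

def drop_outliers(cloud, k=8, std_ratio=2.0):
    """Discard points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the cloud-wide average.
    `cloud` is an Nx3 array of points."""
    d = np.linalg.norm(cloud[:, None, :] - cloud[None, :, :], axis=2)
    d.sort(axis=1)                              # column 0 is each point's distance to itself
    mean_knn = d[:, 1:k + 1].mean(axis=1)       # mean distance to k nearest neighbours
    keep = mean_knn < mean_knn.mean() + std_ratio * mean_knn.std()
    return cloud[keep]
```

MeshLab has built-in filters for the same job, but a quick pass like this before import can cut the mesh-cleanup time considerably.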
