OpenAstroTracker Turns Your DSLR Into An Astronomy Instrument

If you want to take beautiful night sky pictures with your DSLR and you live between 15 and 55 degrees north latitude, you might want to check out OpenAstroTracker. If you have a 3D printer, it will probably take about 60 hours of printing, but you’ll wind up with a pretty impressive setup for your camera. An Arduino manages the tracking and also provides a “go to” capability.

The design is over on Thingiverse and you can find code on GitHub. There’s also a subreddit dedicated to the project. The tracker touts its ability to handle long or heavy lenses and to target 180 degrees in every direction.

Some of the parts you must print are specific to your latitude to within 5 degrees; if you live at latitude 43 degrees, for example, you could pick the 40-degree versions of the parts. So far, though, you must be in the Northern Hemisphere between 15 and 55 degrees.
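If you want to script the part selection, the rule above fits in a few lines of Python. The 5-degree steps and the 15–55° range come from the project as described; the function name and the list of available versions are our own illustrative assumptions, not anything shipped with OpenAstroTracker:

```python
def usable_part_versions(latitude, available=range(15, 60, 5), tolerance=5):
    """Return the printed-part latitude versions within `tolerance` degrees.

    Hypothetical helper: assumes the STLs come in 5-degree steps
    from 15 to 55 degrees north, as the article describes.
    """
    if not 15 <= latitude <= 55:
        raise ValueError("OpenAstroTracker currently covers 15-55 degrees north")
    return [v for v in available if abs(v - latitude) <= tolerance]

print(usable_part_versions(43))  # [40, 45] -- either version is within 5 degrees
```

At 43° north both the 40- and 45-degree parts qualify, which is why the article can suggest the 40-degree versions.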

What kind of images can you expect? The site says this image of Andromeda was taken over several nights using a Soligor 210mm f/4 lens with ISO 800 film.

Not bad at all! Certainly not the view from our $25 department store telescope.

If you’d rather skip the Arduino, try a cheap clock movement. Or you can replace the clock and the Arduino with yourself.

30 thoughts on “OpenAstroTracker Turns Your DSLR Into An Astronomy Instrument”

    1. Yes, you are right. To get away with two axes, you have to precisely align one of the axes with the rotation axis of the Earth. This limits its use to a particular latitude. It’s called a parallactic mount. Without such an alignment, you could still track an object, but the field of view would rotate as you track the sky over a longer period of time. This could be solved by a third axis to rotate the camera, but almost all astronomical devices avoid this to reduce complexity and increase accuracy.

        1. I think it means equatorial here, in a plane parallel to the equator. The original meaning of “parallactic” would seem to imply it corrected parallax error due to the atmosphere bending the light from stars/planets when within a few degrees of the horizon.
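The tracking this thread describes boils down to rotating the aligned axis at the sidereal rate: one full sky revolution per sidereal day, about 86,164 seconds. A minimal sketch of the arithmetic an Arduino has to do to pace a stepper, with made-up motor and gearing numbers (these are illustrative assumptions, not OpenAstroTracker’s actual drivetrain):

```python
SIDEREAL_DAY_S = 86164.0905  # seconds for one full rotation of the sky

def step_interval_s(motor_steps=200, microsteps=16, gear_ratio=100):
    """Seconds between stepper pulses to track at the sidereal rate.

    The step count, microstepping, and gear ratio here are hypothetical
    example values; plug in your own mount's numbers.
    """
    steps_per_sky_revolution = motor_steps * microsteps * gear_ratio
    return SIDEREAL_DAY_S / steps_per_sky_revolution

print(step_interval_s())  # 320,000 steps per revolution -> ~0.269 s per step
```

With finer gearing the interval shrinks and the tracking gets smoother; the firmware just has to issue pulses on that schedule.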

    1. Stacking software could remove these easily enough. Stars and planets hardly move in comparison, so anything streaking across the frame can be removed.

      Couple that with the ongoing work to cut the reflectivity of the satellites, plus the natural cycling of satellites as they age, and the problem is going away before it really gets started.

        1. No, you actually take relatively short (for astrophotography) exposures (~5–10 minutes for typical focal lengths) and stack them. The reason is that NO tracker is 100% perfect, and given the high pixel density of sensors today, small errors creep in pretty quickly.

      1. And it does. It can reject satellites – which were common before Musk – and badly guided frames. 60 second exposures are common as well. (Depending on the subject, you have to keep exposure short enough to not fill detector wells with electrons.) You total as much time as you need, over several nights if needed, and with different filters. The results are spectacular. People with 100mm/4″ refractors and a DSLR routinely produce images that rival old Palomar 200″ results. They can’t do the astrometry of individual stars as finely in a single view, but deep sky stuff is awesome. Check a Deep Sky Stacker gallery.
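The rejection step the commenters describe is typically a per-pixel kappa-sigma clip across the stack: a satellite trail makes one frame’s value at a given pixel wildly brighter than the rest, so it gets discarded before averaging. A minimal single-pixel illustration in pure Python (real stackers such as Deep Sky Stacker apply this across whole frames; the function and parameter names here are our own):

```python
import statistics

def sigma_clip_mean(values, kappa=2.0, iterations=3):
    """Mean of `values` after iteratively rejecting outliers more than
    `kappa` standard deviations from the mean."""
    vals = list(values)
    for _ in range(iterations):
        if len(vals) < 3:
            break
        mean = statistics.fmean(vals)
        sd = statistics.pstdev(vals)
        if sd == 0:
            break
        kept = [v for v in vals if abs(v - mean) <= kappa * sd]
        if len(kept) == len(vals):
            break  # converged, nothing left to reject
        vals = kept
    return statistics.fmean(vals)

# Ten frames' values at one sky pixel; one frame caught a satellite trail.
frames = [102, 99, 101, 100, 98, 103, 100, 101, 4000, 99]
print(statistics.fmean(frames))   # naive average ruined by the trail (490.3)
print(sigma_clip_mean(frames))    # trail rejected, clean sky value (~100.3)
```

This is why a single bright streak costs almost nothing when you have enough frames; the concern in the thread below is what happens when many frames carry trails at once.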

    2. Maybe here’s an idea: track all space-based objects in real time, and drop all frames where a known space object flies through the frame.
      Although with the number of objects up there now, it would probably be easier just to take a single frame when there is nothing overhead.

      You might pick up a few spy satellites by accident, but not much you can do about that.

  1. I would feel a lot better about it if it used the tripod mount to hold the camera. Since the focus, zoom, and aperture control rings are all on the lens, the camera body might be able to rotate when you adjust them.

  2. Some corrections to what has been said. The length of an individual astronomical exposure is mostly a function of lens speed, sky darkness, and any filtration used. For unfiltered, visual (400–700 nm) shots at f/2 under very dark skies, there is no reason to go longer than 3 minutes; 6 minutes at f/2.8, 12 minutes at f/4, etc. Much shorter for light-polluted skies, and decent results can be obtained in a third of these times. You must correct for the Earth’s rotation in any exposure longer than a few seconds, perhaps 10 seconds with a 35 mm lens, scaled up or down inversely with focal length.

     It is normal to take several exposures (sometimes dozens) and average them in the computer to greatly improve results. In this process, satellite trails, airplanes, and cosmic ray hits can be averaged out, but every time this is done, data is lost. Once the sky is full of Musk pollution, each wide-field image will be filled with dozens of satellite trails, maybe 50–100 for really wide-field images. Averaging out all of these will greatly degrade the image; there is no way around this other than perhaps to triple the number of images used in the average. In fact, this level of night sky pollution will not be fully reversed even by such drastic measures. Musk is on the verge of destroying the night sky for generations to come, all to line his pockets with even more cash. He will sell you a line about helping out the digitally disenfranchised, but it is just a cover for relentless greed.

    1. The way around it is for Musk to give launch space to high orbit for a bunch of small space telescopes of 18″ to 24″ Planewave quality, and a way for any of us to book time on them.
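The exposure rules of thumb in the long comment above reduce to two simple formulas: untracked exposure time scales inversely with focal length (10 s at 35 mm), and the dark-sky exposure limit doubles with each full stop, i.e. scales with the square of the f-number (3 minutes at f/2). A quick sketch under exactly those stated assumptions (function names are ours):

```python
def max_untracked_exposure_s(focal_length_mm):
    """Commenter's rule: ~10 s at 35 mm before star trails appear,
    scaled inversely with focal length."""
    return 10.0 * 35.0 / focal_length_mm

def dark_sky_exposure_min(f_number):
    """Commenter's guideline: 3 minutes at f/2 under very dark skies,
    doubling per full stop (exposure scales as f-number squared)."""
    return 3.0 * (f_number / 2.0) ** 2

print(max_untracked_exposure_s(210))  # the Soligor 210 mm lens: under 2 s untracked
print(dark_sky_exposure_min(4))       # f/4 -> 12 minutes, matching the comment
```

At 210 mm you get well under two seconds before stars trail, which is exactly why the Andromeda shot mentioned in the article needed a tracker at all.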

  3. If this thing is driving a motor, you should be able to use it in the Southern Hemisphere too. If you use the south celestial pole instead of the north pole as reference, just switch the polarity on the motor and you should be good to go.
