Ancient Greek astronomer Hipparchus worked to accurately catalog and record the coordinates of celestial objects. But while Hipparchus’ Star Catalogue is known to have existed, the document itself is lost to history. Even so, new evidence has come to light thanks to patient work and multispectral imaging.
If you have any astronomer friends you’ll soon discover that theirs is a world of specialist, high-quality optical equipment far beyond the everyday tinkerer, and for mere mortals the dream of those amazing deep-space images remains out of reach. It’s not completely impossible for the night sky to deliver impressive imagery on a budget though, as [David Schneider] shows us with a Raspberry Pi-powered whole sky camera.
The project was born of seeing a meteor and idly wondering whether meteorite landing sites could be triangulated from a network of cameras, something he quickly discovered had already been done with some success. Along the way though he found the allsky camera project, and decided to build his own. This took the form of a Raspberry Pi 3 and a Pi HQ camera with a wide-angle lens mounted pointing skywards under an acrylic dome. It’s not the Hubble Space Telescope by any means, but the results are nevertheless impressive, particularly in a timelapse. We wish there were less light pollution where we live so we could try it for ourselves.
[Christiaan Huygens] was a pretty decent mathematician and scientist by the standards of the 17th century. However, the telescopes he built were considered relatively poor in quality for the period. Now, as reported by Science News, we may know why. The celebrated Huygens may have needed corrective glasses all along.
Huygens is known for, among other things, his contributions to astronomy. He discovered Titan, the largest moon of Saturn, and also studied the planet’s rings. He achieved this despite using telescopes that contemporaries described as fuzzy, or blurrier than they should have been.
Huygens built two-lens telescopes, and would keep a table of which lenses to combine for different magnification levels. However, his calculations don’t align well with today’s understanding of optics. As it turns out, Huygens may have been nearsighted, which would account for why his telescopes were blurry. To his vision, they may indeed have been sharp, due to the nature of his own eyes. Supporting this are contemporary accounts suggesting that Huygens’ father was nearsighted, with the condition perhaps running in the family. According to calculations by astronomer Alexander Pietrow, Huygens may have had 20/70 vision, meaning he could only make out at 20 feet what a person with “normal” vision could read from 70 feet away.
NASA’s Hubble Space Telescope is arguably the best known and most successful observatory in history, delivering unprecedented images that have tantalized the public and astronomers alike for more than 30 years. But even so, there’s nothing particularly special about Hubble. Ultimately it’s just a large optical telescope which has the benefit of being in space rather than on Earth’s surface. In fact, it’s long been believed that Hubble is not dissimilar from contemporary spy satellites operated by the National Reconnaissance Office — it’s just pointed in a different direction.
There are however some truly unique instruments in NASA’s observational arsenal, and though they might not have the name recognition of the Hubble or James Webb Space Telescopes, they still represent incredible feats of engineering. This is perhaps best exemplified by the Stratospheric Observatory for Infrared Astronomy (SOFIA), an airborne infrared telescope built into a retired airliner that is truly one-of-a-kind.
Unfortunately, this unique aerial telescope also happens to be exceptionally expensive to operate. With an annual operating cost of approximately $85 million, it’s one of the agency’s most expensive ongoing astrophysics missions. After twelve years of observations, NASA and their partners at the German Aerospace Center have decided to end the SOFIA program when its current mission concludes in September.
With the telescope so close to making its final observations, it seems a good time to look back at this incredible program and why the US and German space centers decided it was time to put SOFIA back in the hangar.
On astronomical telescopes of even middling power, a small “finderscope” is often mounted in parallel to the main optics to assist in getting the larger instrument on target. The low magnification of the finderscope offers a far wider field of view than the primary telescope, which makes it much easier to find small objects in the sky. Even if your target is too small or faint to see in the finderscope, just being able to get your primary telescope pointed at the right celestial neighborhood is a huge help.
But [Dilshan Jayakody] still thought he could improve on things a bit. Instead of a small optical scope, his StarPointer is an electronic device that can determine the orientation of the telescope it’s mounted to. As the ADXL345 accelerometer and HMC5883L magnetometer inside the STM32F103C8 powered gadget detect motion, the angle data is sent to Stellarium — an open source planetarium program. Combined with a known latitude and longitude, this allows the software to show where the telescope is currently pointed in the night sky.
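The math behind a gadget like this is the classic tilt-compensated compass: the accelerometer gives pitch and roll from the gravity vector, and the magnetometer, corrected for that tilt, gives the heading. Here’s a minimal sketch of the idea in Python — the axis conventions, units, and lack of calibration or magnetic-declination correction are our simplifying assumptions, not details from [Dilshan]’s firmware:

```python
import math

def orientation(ax, ay, az, mx, my, mz):
    """Estimate altitude (pitch) and azimuth (heading) in degrees from raw
    accelerometer readings (in g) and magnetometer readings (any consistent
    units). Assumed axes: x points along the telescope tube, z points up
    when the tube is level. Real hardware needs calibration and a local
    magnetic declination correction before the numbers mean anything."""
    # Roll and pitch recovered from the gravity vector
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, ay * math.sin(roll) + az * math.cos(roll))
    # Rotate the magnetometer vector back to the horizontal plane,
    # then take the heading from its horizontal components
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    heading = math.atan2(-yh, xh) % (2 * math.pi)
    return math.degrees(pitch), math.degrees(heading)

# Level tube pointed at magnetic north: altitude ~0, azimuth ~0
print(orientation(0.0, 0.0, 1.0, 1.0, 0.0, 0.0))
```

Given the observer’s latitude, longitude, and the current time, an altitude/azimuth pair like this is exactly what a planetarium program needs to place a crosshair on its sky map.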
As demonstrated in the video after the break, this provides real-time feedback which is easy to understand even for the absolute beginner: all you need to do is slew the scope around until the object you want to look at is under the crosshairs. While we wouldn’t recommend looking at a bright computer screen right before trying to pick out dim objects in your telescope’s eyepiece, we can certainly see the appeal of this “virtual” finderscope.
Have you ever wished we could peek at all these exoplanets that have been recently discovered? We aren’t likely to visit anytime soon, but it would be possible to build a truly giant telescope that could take a look at something like that. At least according to [SciShow Space] in a recent video you can see below.
The idea put forth in a recent scientific paper is to deliberately create the conditions that naturally form gravitational lenses. If you recall, scientists have used these naturally-occurring lenses to image the oldest star ever observed. These natural super-telescopes have paid off many times, but you can’t pick what you want to look at. It is all a function of the distance to the star creating the lens and where the line between us and that star happens to point.
But what if you could create your own gravity lens? Granted, we probably aren’t going to do that in our garages. However, a recent paper talks about launching an optical detector that you could maneuver onto a line passing through both the object you want to see and our own sun. We clearly have the technology to do this. After all, we have several capable space telescopes and a handful of probes operating far from the sun.
That is one of the biggest catches, though. This new telescope will need to be some 550 AU from the sun to get good results. For the record, the Earth is 1 AU (about 8 light minutes) out. Pluto — maybe not a planet anymore, but still a signpost on the way out of the solar system — is a scant 39 AU out. Voyager 1, which has been racing away from the sun since 1977, is only about 156 AU out.
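That 550 AU figure isn’t arbitrary: it falls straight out of general relativity. A light ray grazing the sun’s limb is bent by the Einstein deflection angle α = 4GM/(c²b), and simple geometry says it crosses the optical axis at distance d = b/α = b²c²/4GM. A quick back-of-the-envelope check (our sketch, not from the paper) reproduces the number:

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m -- impact parameter for a grazing ray
AU = 1.496e11      # astronomical unit, m

# Einstein deflection angle for a ray grazing the solar limb
alpha = 4 * G * M_SUN / (C**2 * R_SUN)

# The bent ray crosses the optical axis at d = b / alpha
focal_distance_au = R_SUN / alpha / AU
print(f"Solar gravitational focus begins at ~{focal_distance_au:.0f} AU")
```

Rays passing farther from the limb focus even farther out, which is why 550 AU is where useful observations *begin* rather than a single sweet spot.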
Because the craft would be so far out, it would be practically a one-shot mission. The hardware would also have to stay reliable for the roughly 17 years it would take, with today’s technology, to get into position, and you’d need a way to get the data back over that distance. All doable, but non-trivial.
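To put “non-trivial” in numbers, consider what a 17-year cruise to 550 AU implies for average speed, and how long the science data takes to crawl home at the speed of light. These are our own rough averages — ignoring acceleration profiles and any solar-sail or gravity-assist trickery — but they frame the problem:

```python
AU = 1.496e11          # astronomical unit, m
C = 2.998e8            # speed of light, m/s
YEAR = 365.25 * 86400  # seconds in a Julian year

dist_m = 550 * AU

# Average speed needed to cover 550 AU in 17 years;
# for comparison, Voyager 1 is leaving the solar system at ~17 km/s
v_kms = dist_m / (17 * YEAR) / 1000

# One-way light time for data coming back from the detector
delay_days = dist_m / C / 86400

print(f"Required average speed: ~{v_kms:.0f} km/s")
print(f"One-way signal delay:   ~{delay_days:.1f} days")
```

Roughly nine times Voyager 1’s speed, and a multi-day wait for every downlink — hence the emphasis on reliability.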
The paper simulates what the Earth would look like using this technique from a nearby star. The images are shockingly good, especially after a bit of post-processing. Meanwhile, we may have to settle for more modest images. You might not see detail, but it is possible to find exoplanets with reasonably modest equipment.
It may be blurry and blotchy, but it’s ours. The first images of the supermassive black hole at the center of the Milky Way galaxy were revealed this week, and they caused quite a stir. You may recall the first images of the supermassive black hole at the center of the M87 galaxy from a couple of years ago: spectacular images that captured exactly what all the theories said a black hole should look like — or more precisely, what the accretion disk and event horizon should look like, since black holes themselves aren’t much to look at. That black hole, dubbed M87*, is over 55 million light-years away, but is so huge and so active that it was relatively easy to image. The black hole at the center of our own galaxy, Sagittarius A*, is comparatively tiny — its event horizon would fit inside the orbit of Mercury — and much closer at only 26,000 light-years or so. But our black hole is much less active and obscured by dust, so imaging it was far more difficult. It’s a stunning technical achievement, and the images are certainly worth checking out.
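A neat coincidence makes the two images comparable at all: M87* is roughly two thousand times farther away, but also roughly fifteen hundred times more massive, so both shadows subtend a few tens of microarcseconds on the sky. A quick estimate shows this, using the rule of thumb that a non-rotating black hole’s shadow spans about √27 Schwarzschild radii — the mass and distance figures here are approximate published values we’ve plugged in, not numbers from the article:

```python
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8      # speed of light, m/s
M_SUN = 1.989e30 # solar mass, kg
LY = 9.461e15    # metres per light-year
RAD_TO_UAS = 180 / 3.141592653589793 * 3600e6  # radians -> microarcseconds

def shadow_uas(mass_msun, dist_ly):
    """Approximate angular diameter of a black hole's shadow:
    ~sqrt(27) Schwarzschild radii seen from distance dist_ly."""
    rs = 2 * G * mass_msun * M_SUN / C**2  # Schwarzschild radius, m
    return 27**0.5 * rs / (dist_ly * LY) * RAD_TO_UAS

m87 = shadow_uas(6.5e9, 55e6)      # M87*: ~6.5 billion solar masses
sgr_a = shadow_uas(4.3e6, 26_000)  # Sgr A*: ~4.3 million solar masses

print(f"M87*:   ~{m87:.0f} microarcseconds")
print(f"Sgr A*: ~{sgr_a:.0f} microarcseconds")
```

Both land in the 40–55 microarcsecond range, which is why the same Event Horizon Telescope array could resolve each — and why Sgr A*’s difficulty came from its rapid variability and the intervening dust, not its angular size.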
Another one from the “Why didn’t I think of that?” files — contactless haptic feedback using the mouth is now a thing. This comes from the Future Interfaces Group at Carnegie Mellon and is intended to provide an alternative to what ends up being about the only practical haptic device for VR and AR applications — vibrations from off-balance motors. Instead, this uses an array of ultrasonic transducers positioned on a VR visor and directed at the user’s mouth. By properly driving the array, pressure waves can be directed at the lips, teeth, and tongue of the wearer, providing feedback for in-world events. The mock game demonstrated in the video below is a little creepy — we’re not sure how many people enjoyed the feeling of cobwebs brushing against the face or the splatter of spider guts in the mouth. Still, it’s a pretty cool idea, and we’d like to see how far it can go.