As it so happens, [Andrew Cooper] was just about to leave the summit of Mauna Kea (in Hawaii) when his radio instructed him otherwise: there was an issue. Upon returning, [Andrew] was met by a room of scientists and summit supervisors. “Yeah, this was not good, why are they all looking at me? Oh, h%#*!” The rotator wasn’t moving the telescope, and “no rotator equals no science data.” After being briefed on the problem, [Andrew] got to work. Was it a mechanical issue? No: manual mode worked just fine, which also indicated that the amplifiers and limit switches were functional as well.
Jumping from chip to chip, [Andrew] came across an odd voltage: 9.36 V. In the CMOS logic [Andrew] was investigating, this voltage should have been either High (15 V) or Low (0 V), nowhere in between. Judging by the 9.36 V, [Andrew] decided to replace the driving IC. One DS3236 later, nothing had changed. Well, maybe one of the loads was pulling the line low? With only two candidates, [Andrew] eliminated that possibility quickly. Likely feeling as if he were running out of proverbial rope, [Andrew] remembered something important: “the DS3236 driving this circuit is an open collector output, it needs a pull-up to go high.”
Reviewing the schematic, [Andrew] identified the DS3236’s pull-up: an LED and its current-limiting resistor. While the carbon composition resistor was “armageddon proof,” [Andrew] was suspicious of the LED. “Nick, can you get me a 5k resistor from the lab?” Holding the resistor across the pins of the chip immediately enabled the amplifiers.
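As a back-of-the-envelope sanity check on that fix, an open-collector pull-up can be sized from the supply rail and the current the output is allowed to sink. The numbers below are illustrative assumptions, not the telescope’s actual figures:

```python
# Estimate a pull-up resistor for an open-collector output.
# Values are assumptions for illustration, not the Keck's actual specs.
V_SUPPLY = 15.0      # logic-high rail (V), per the CMOS levels in the story
V_OL = 0.5           # worst-case output-low voltage of the driver (V), assumed
I_SINK_MAX = 0.003   # current we allow the output to sink (A), assumed 3 mA

r_min = (V_SUPPLY - V_OL) / I_SINK_MAX   # smallest pull-up that stays under I_SINK_MAX
print(f"Minimum pull-up: {r_min:.0f} ohms")  # ~4833 ohms, in the ballpark of the 5k that worked
```

With those assumed limits, a 5k part lands right in the workable range, which is consistent with the quick lab fix.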
[Andrew] summarizes things quite well: “yes… One of the world’s largest telescopes, 370 tons of steel and glass, was brought to a halt because of a bad indicator LED”. It stopped things by doing nothing, or rather, by not turning on.
As exciting as Eclipse 2017 is going to be this Monday, for some folks it might appear a bit — underwhelming. Our star only occupies about half a degree of the sky, and looking at the partial phase with eclipse glasses might leave you yearning for a bigger image. If that’s you, you’ll need to build a sun funnel for super-sized eclipse fun.
[Grady] at Practical Engineering is not going to be lucky enough to be within the path of totality, but he is going to be watching the eclipse with a bunch of school kids. Rather than just outfitting his telescope with a filter and having the kids queue up for a quick peek, he built what amounts to a projection screen for the telescope’s eyepiece. It’s just a long funnel, and while [Grady] chose aluminum and rivets, almost any light, stiff material will do. He provides a formula for figuring out how long the funnel needs to be for your scope, along with plans for laying out the funnel. We have to take exception to his choice of screen material — it seems like the texture of the translucent shower curtain might interfere with the image a bit. But still, the results look pretty good in the video below.
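[Grady]’s exact formula may differ, but the standard sun-funnel geometry is simple enough to sketch: the sun subtends about 0.53°, so the projected image grows with distance from the eyepiece in proportion to the magnification. The scope and eyepiece numbers below are assumptions for illustration:

```python
import math

# Sun-funnel length from projection geometry (a sketch; [Grady]'s exact
# formula may differ). The sun subtends roughly 0.53 degrees, so at
# magnification M the projected solar image grows by about
# M * tan(0.53 deg) millimeters per millimeter of distance past the eyepiece.
SUN_ANGULAR_DIAMETER_DEG = 0.53

def funnel_length(scope_focal_mm, eyepiece_focal_mm, image_diameter_mm):
    magnification = scope_focal_mm / eyepiece_focal_mm
    growth_per_mm = magnification * math.tan(math.radians(SUN_ANGULAR_DIAMETER_DEG))
    return image_diameter_mm / growth_per_mm

# Assumed example: 900 mm scope, 25 mm eyepiece, 150 mm projected image
print(f"{funnel_length(900, 25, 150):.0f} mm")  # roughly 450 mm
```

In other words, a longer funnel or a shorter eyepiece (higher magnification) both buy you a bigger image.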
While most people who make the trek to the path of totality for the Great American Eclipse next week will fix their gazes skyward as the heavenly spectacle unfolds, we suspect many will attempt to post a duck-face selfie with the eclipsed sun in the background. But at least one man will be feverishly tending to an experiment.
On a lonely hilltop in Wyoming, Dr. Don Bruns will be attempting to replicate a famous experiment. If he succeeds, not only will he have pulled off something that’s only been done twice before, he’ll provide yet more evidence that Einstein was right.
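The number [Bruns] is chasing is Einstein’s predicted deflection of starlight grazing the Sun’s limb, θ = 4GM/(c²R), which works out to about 1.75 arcseconds:

```python
import math

# General relativity's predicted deflection of starlight at the solar limb:
#   theta = 4 * G * M / (c^2 * R)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
C = 2.998e8        # speed of light, m/s
R_SUN = 6.957e8    # solar radius, m

theta_rad = 4 * G * M_SUN / (C**2 * R_SUN)
theta_arcsec = math.degrees(theta_rad) * 3600
print(f"{theta_arcsec:.2f} arcsec")  # about 1.75
```

That tiny shift in apparent star positions, measurable only when the Moon blots out the Sun’s glare, is exactly what Eddington’s 1919 expedition looked for.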
The Hackaday Prize is more than just giving tens of thousands of dollars to hardware hackers. It’s also about funding the next batch of Open Source hardware products. Alongside The Hackaday Prize — the contest where we’re funding hardware that will change the world — we’re also giving away $30,000 to the project that will best become a product. It’s almost like we’re funding hardware startups here.
[Dessislav Gouzgounov] wanted to build a small piece of hardware — a GoTo for his telescope. This handheld controller would allow him to use software to align the telescope with whatever celestial body he’s checking out.
Many GoTos simply interface with a laptop, but [Dessislav] built a standalone system centered around an Arduino Due and 240×400 touch screen, with GPS, RTC, and Bluetooth under the hood. It works on both hemispheres and contains a database of 250 celestial objects, features different speeds for time-delayed tracking of celestial, lunar, and solar phenomena, and it can work with any stepper-equipped telescope.
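Under the hood, the core job of any GoTo is coordinate conversion: turning a catalog position (RA/Dec) into where to point the scope (Alt/Az) for a given site and time. A simplified sketch of that standard math, not [Dessislav]’s actual code; the local sidereal time is taken as an input, and refraction and precession are ignored:

```python
import math

# Convert equatorial coordinates (RA/Dec) to horizontal (Alt/Az).
# Simplified sketch: local sidereal time is given, refraction ignored.
def radec_to_altaz(ra_deg, dec_deg, lat_deg, lst_deg):
    ha = math.radians(lst_deg - ra_deg)          # hour angle of the object
    dec = math.radians(dec_deg)
    lat = math.radians(lat_deg)

    sin_alt = math.sin(dec) * math.sin(lat) + math.cos(dec) * math.cos(lat) * math.cos(ha)
    alt = math.asin(sin_alt)

    cos_az = (math.sin(dec) - sin_alt * math.sin(lat)) / (math.cos(alt) * math.cos(lat))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if math.sin(ha) > 0:                         # object west of the meridian
        az = 2 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# Object on the meridian (LST == RA), on the celestial equator, seen from 40N:
alt, az = radec_to_altaz(ra_deg=120, dec_deg=0, lat_deg=40, lst_deg=120)
print(f"alt {alt:.1f}, az {az:.1f}")  # 50 degrees up, due south (180)
```

The GPS supplies the latitude and longitude, and the RTC supplies the time from which sidereal time is derived — which is why both are on board.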
We covered [Dessislav]’s previous version of the rDuinoScope, but he’s improved the project considerably with over 2,400 lines of code, including a new menu system and a star atlas showing the region of sky at which the telescope is currently pointing, among other improvements. The project is open source and you can learn more about it on [Dessislav]’s project page or check out his code on GitHub.
New to astrophotography, [Jason Bowling] had heard that the Raspberry Pi’s camera module could be used as a low-cost entry into the hobby. Having a Raspberry Pi B+ and camera module on hand from an old project, he dove right in, detailing the process for any other newcomers.
After gingerly removing the camera’s lens, [Bowling] fit the module snugly into a 3D printed case — courtesy of a friend — and connected it to a separate case for the Pi. He then mounted the camera directly on the telescope — a technique known as prime-focus photography, which treats the telescope like an oversized camera lens. A USB battery pack is perfect for powering the Pi for several hours.
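With prime focus the telescope is the lens, so the image scale falls out of pixel pitch and focal length alone. The numbers below are assumptions for illustration (1.4 µm roughly matches the v1 Pi camera’s OV5647 sensor; 900 mm is a generic small scope):

```python
# Prime-focus image scale:
#   arcsec/pixel = 206.265 * pixel_pitch_um / focal_length_mm
# (206.265 converts micrometers-per-millimeter into arcseconds.)
def image_scale(pixel_pitch_um, focal_length_mm):
    return 206.265 * pixel_pitch_um / focal_length_mm

# Assumed: ~1.4 um pixels (v1 Pi camera) on a 900 mm scope
print(f"{image_scale(1.4, 900):.2f} arcsec/pixel")
```

The tiny pixels mean a surprisingly fine image scale — good for planetary detail, though the small sensor keeps the field of view narrow.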
When away from home, [Bowling] has set up his Pi to act as a wireless access point; this allows the Pi to send a preview to his phone or tablet to make adjustments before taking a picture. [Bowling] admits that the camera is not ideal, so a little post-processing is necessary to flesh out a quality picture, but you work with what you have.
We’ve all enjoyed looking up at a clear night sky and marveled at the majesty of the stars. Some of us have even pointed telescopes at particular celestial objects to get a closer view. Anyone who’s ever looked at anything beyond Jupiter knows the hassle involved. It is most unfortunate that the planet we reside on happens to rotate about a fixed axis, which makes it somewhat difficult to keep a celestial object in the view of your scope.
It doesn’t take much to strap a few steppers and some silicon brains to a scope to counter the rotation of earth, and such systems have been available for decades. They are unfortunately quite expensive. So [Dessislav Gouzgounov] took matters into his own hands and developed the rDuinoScope – an open source telescope control system.
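Countering the Earth’s rotation means driving the right-ascension axis at the sidereal rate: 360° per sidereal day (about 86,164 seconds), or roughly 15.04 arcseconds per second. From there, the stepper pulse rate is just gearing arithmetic. The motor and gear figures below are illustrative assumptions:

```python
# Sidereal tracking: one axis revolution per sidereal day.
SIDEREAL_DAY_S = 86164.1  # seconds

def sidereal_step_rate(steps_per_rev, microsteps, gear_ratio):
    """Stepper pulses per second needed to track at the sidereal rate."""
    steps_per_axis_rev = steps_per_rev * microsteps * gear_ratio
    return steps_per_axis_rev / SIDEREAL_DAY_S

# Assumed: 200-step motor, 16x microstepping, 144:1 worm gear
print(f"{sidereal_step_rate(200, 16, 144):.2f} steps/s")  # ~5.35
```

A rate that leisurely is trivial for an Arduino to generate, which is why mount controllers can be built so cheaply.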
Based on the Arduino Due, the system stores a database of 250 stellar objects. Combined with an RTC and GPS, the rDuinoScope can locate and lock on to your favorite nebula and track it, allowing you to view it in peace. Be sure to grab the code and let us know when you have your own rDuinoScope set up!
Like any Moore’s Law-inspired race, the megapixel race in digital cameras in the late 1990s and into the 2000s was a harsh battleground for every manufacturer. With the development of the smartphone, it became a war on two fronts, with Samsung eventually cramming twenty megapixels into a handheld. Although no clear winner among consumer-grade cameras was ever declared (and Samsung ended up reducing their flagship phone’s cameras to sixteen megapixels for reasons we’ll discuss), it seems as though this race is over, fizzling out into a void where even marketing and advertising groups don’t readily venture. What happened?
Moore’s Law predicts that transistor density on a given computer chip should double about every two years. A digital camera’s sensor is remarkably similar, using the same silicon to form charge-coupled devices or CMOS sensors (the same CMOS technology used in some RAM and other digital logic) to detect photons that hit it. It’s not too far of a leap to see how Moore’s Law would apply to the number of photodetectors on a digital camera’s image sensor. Like transistor density, however, there’s also a limit to how many photodetectors will fit in a given area before undesirable effects start to appear.
Image sensors have come a long way since video camera tubes. In the ’70s, the charge-coupled device (CCD) replaced the cathode ray tube as the dominant video capturing technology. A CCD works by arranging capacitors into an array and biasing them with a small voltage. When a photon hits one of the capacitors, it is converted into an electrical charge, which can then be read out and stored as digital information. While there are still specialty CCD sensors for some niche applications, most image sensors are now of the CMOS variety. CMOS uses photodiodes, rather than capacitors, along with a few other transistors for every pixel. CMOS sensors perform better than CCD sensors because each pixel has its own amplifier, which results in more accurate capturing of data. They are also faster, scale more readily, use fewer components in general, and use less power than a comparably sized CCD. Despite all of these advantages, however, there are still many limitations to modern sensors when more and more of them get packed onto a single piece of silicon.
While transistor density tends to be limited by quantum effects, image sensor density is limited by what is effectively a “noisy” picture. Noise can be introduced in an image as a result of thermal fluctuations within the material, so if the voltage threshold for a single pixel is so low that it falsely registers a photon when it shouldn’t, the image quality will be greatly reduced. This is more noticeable in CCD sensors (one effect is called “blooming”) but similar defects can happen in CMOS sensors as well. There are a few ways to solve these problems, though.
First, the voltage threshold can be raised so that random thermal fluctuations don’t rise above the threshold and trigger the pixels. In a DSLR, this typically means changing the ISO setting of the camera, where a lower ISO setting means more light is required to trigger a pixel, but random fluctuations are less likely to register. From a camera designer’s point of view, however, a higher voltage generally implies greater power consumption and some speed considerations, so there are tradeoffs to make in this area.
Another reason that thermal fluctuations cause noise in image sensors is that the pixels themselves are so close together that they influence their neighbors. The answer here seems obvious: simply increase the area of the sensor, make the pixels of the sensor bigger, or both. This is a good solution if you have unlimited area, but in something like a cell phone this isn’t practical. This gets to the core of the reason that most modern cell phones seem to be practically limited somewhere in the sixteen-to-twenty megapixel range. If the pixels are made too small to increase megapixel count, the noise will start to ruin the images. If the pixels are too big, the picture will have a low resolution.
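The pixel-size tradeoff can be made concrete with a shot-noise estimate: photon arrival is a Poisson process, so the best-case signal-to-noise ratio is the square root of the photons collected, and photons scale with pixel area. The flux figure below is an arbitrary assumption for illustration:

```python
import math

# Shot-noise-limited SNR: sqrt(photons collected), photons ~ pixel area.
# Halving the pixel pitch quarters the light per pixel and halves the SNR.
def shot_noise_snr(pixel_pitch_um, photons_per_um2=100):
    # photons_per_um2 is an arbitrary assumed flux for a given exposure
    photons = photons_per_um2 * pixel_pitch_um ** 2
    return math.sqrt(photons)

for pitch in (2.0, 1.0):
    print(f"{pitch} um pixel: SNR {shot_noise_snr(pitch):.0f}")
# 2.0 um pixel: SNR 20
# 1.0 um pixel: SNR 10
```

This is the physics behind the practical ceiling: shrinking pixels to cram in more megapixels directly trades away signal-to-noise.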
There are some non-technological ways of increasing megapixel count for an image as well. For example, a panoramic image will have a megapixel count much higher than that of the camera that took the picture simply because each part of the panorama has the full megapixel count. It’s also possible to reduce noise in a single frame of any picture by using lenses that collect more light (lenses with a lower f-number), which allows the photographer to use a lower ISO setting to reduce the camera’s sensitivity.
Of course, if you have unlimited area you can make image sensors of virtually any size. There are some extremely large, expensive cameras called gigapixel cameras that can take pictures of unimaginable detail. Their size and cost are limiting factors for consumer devices, though, and as such they are generally used for specialty purposes only. The largest image sensor ever built has a surface of almost five square meters and is the size of a car. The camera will be put to use in 2019 in the Large Synoptic Survey Telescope in South America, where it will capture images of the night sky with its 8.4 meter primary mirror. If this were part of the megapixel race in consumer goods, it would certainly be the winner.
With all of this being said, it becomes obvious that there are many more considerations in a digital camera than just the megapixel count. With so many facets of a camera such as physical sensor size, lenses, camera settings, post-processing capabilities, filters, etc., the megapixel number was essentially an easy way for marketers to advertise the claimed superiority of their products until the practical limits of image sensors were reached. Beyond a certain limit, more megapixels doesn’t automatically translate into a better picture. As already mentioned, the megapixel count can matter, but there are many ways to make up for a lower count if you have to. For example, images with high dynamic range are becoming the norm even in cell phones, which also helps eliminate the need for a flash. Whatever you decide, though, if you want to start taking great pictures don’t worry about specs; just go out and take some photographs!
(Title image: VISTA gigapixel mosaic of the central parts of the Milky Way, produced by European Southern Observatory (ESO) and released under Creative Commons Attribution 4.0 International License. This is a scaled version of the original 108,500 x 81,500, 9-gigapixel image.)