Digitize Your Room With LIDAR

What’s the best way to image a room? A picture? Hah — don’t be so old-fashioned! You want a LIDAR rig to scan the space and reconstruct it as a 3D point map in your computer.

Hot on the heels of [Saulius Lukse]’s scanning thermometer, he’s replaced the thermal camera on his pan/tilt setup with a time-of-flight (TOF) rangefinder (a Garmin LIDAR unit) capable of 500 samples per second, letting him scan his room in a mere fifteen minutes. The pan/tilt position data is combined with the ranging information in Python to produce a point cloud. Open that file in a 3D manipulation program and you’ll be treated to a sight like this:
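The geometry behind the point cloud is just a spherical-to-Cartesian conversion. Here's a minimal sketch of that step (our own code, not [Saulius]'s scripts; the angle conventions and units are assumptions):

```python
import numpy as np

def to_point_cloud(pan_deg, tilt_deg, range_m):
    """Convert pan/tilt angles and range readings into XYZ points.

    pan_deg, tilt_deg, range_m: equal-length 1-D arrays of samples.
    Returns an (N, 3) array of Cartesian coordinates in meters.
    """
    pan = np.radians(np.asarray(pan_deg))
    tilt = np.radians(np.asarray(tilt_deg))
    r = np.asarray(range_m)

    # Spherical-to-Cartesian: tilt measured from the horizontal plane,
    # pan measured around the vertical axis.
    x = r * np.cos(tilt) * np.cos(pan)
    y = r * np.cos(tilt) * np.sin(pan)
    z = r * np.sin(tilt)
    return np.column_stack((x, y, z))

# Save as an ASCII .xyz file that MeshLab or CloudCompare will open directly:
# np.savetxt("room.xyz", to_point_cloud(pan, tilt, dist), fmt="%.4f")
```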

Continue reading “Digitize Your Room With LIDAR”

The Megapixel Race And Its Clear Winner

Like any Moore’s Law-inspired race, the megapixel race in digital cameras in the late 1990s and into the 2000s was a harsh battleground for every manufacturer. With the development of the smartphone, it became a war on two fronts, with Samsung eventually cramming twenty megapixels into a handheld. Although no clear winner among consumer-grade cameras was ever announced (and Samsung ended up reducing its flagship phone’s camera to sixteen megapixels, for reasons we’ll discuss), it seems as though this race is over, fizzling out into a void where even marketing and advertising groups don’t readily venture. What happened?

The Technology

Moore’s Law predicts that transistor density on a given computer chip should double about every two years. A digital camera’s sensor is remarkably similar, using the same silicon to form charge-coupled devices or CMOS sensors (the same CMOS process used in some RAM and other digital logic) to detect the photons that hit it. It’s not much of a leap to see how Moore’s Law would apply to the number of photo detectors on a digital camera’s image sensor. Like transistor density, however, there’s a limit to how many photo detectors will fit in a given area before undesirable effects start to appear.
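As a rough back-of-the-envelope illustration (our numbers, not from the article), here's how quickly pixel pitch shrinks as more photosites are packed onto a fixed sensor area:

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate pixel pitch in micrometers for a given sensor size,
    assuming square pixels and ignoring inter-pixel wiring overhead."""
    area_um2 = (sensor_width_mm * 1e3) * (sensor_height_mm * 1e3)
    return math.sqrt(area_um2 / (megapixels * 1e6))

# A small phone-class sensor (~5.8 x 4.3 mm) versus a full-frame DSLR (36 x 24 mm)
for name, w, h in [("phone sensor", 5.8, 4.3), ("full frame", 36.0, 24.0)]:
    for mp in (8, 16, 20):
        print(f"{name}, {mp} MP: {pixel_pitch_um(w, h, mp):.2f} um pitch")
```

At twenty megapixels the phone-class pixels come out barely over a micrometer across, which is exactly where the noise problems described below start to bite.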

CMOS image sensor mechanism illustration, by User:たまなるたみ (own drawing, GPL), https://commons.wikimedia.org/w/index.php?curid=371238. Note that each pixel has its own amplifier.

Image sensors have come a long way since video camera tubes. In the ’70s, the charge-coupled device (CCD) replaced the cathode ray tube as the dominant video capturing technology. A CCD works by arranging capacitors into an array and biasing them with a small voltage. When a photon hits one of the capacitors, it is converted into an electrical charge which can then be stored as digital information. While there are still specialty CCD sensors for some niche applications, most image sensors are now of the CMOS variety. CMOS uses photodiodes, rather than capacitors, along with a few other transistors for every pixel. CMOS sensors perform better than CCD sensors because each pixel has its own amplifier, which results in more accurate data capture. They are also faster, scale more readily, use fewer components in general, and use less power than a comparably sized CCD. Despite all of these advantages, however, modern sensors still run into limits when more and more pixels get packed onto a single piece of silicon.

While transistor density tends to be limited by quantum effects, image sensor density is limited by what is effectively a “noisy” picture. Noise can be introduced in an image as a result of thermal fluctuations within the material, so if the voltage threshold for a single pixel is so low that it falsely registers a photon when it shouldn’t, the image quality will be greatly reduced. This is more noticeable in CCD sensors (one effect is called “blooming”) but similar defects can happen in CMOS sensors as well. There are a few ways to solve these problems, though.

A sunrise picture taken with an entry-level DSLR at ISO 1600. At this sensitivity, noise in the clouds can be seen in the form of random fluctuations of some pixels. This effect would be mitigated by a camera with a larger sensor, a lower sensor sensitivity with a longer shutter speed (which would blur the turbine blades), or a scene with more light. Photo © 2016 by Bryan Cockfield

First, the voltage threshold can be raised so that random thermal fluctuations don’t rise above it and falsely trigger pixels. In a DSLR, this typically means changing the ISO setting, where a lower ISO means more light is required to trigger a pixel but random fluctuations are less likely to register. From a camera designer’s point of view, however, a higher threshold voltage generally implies greater power consumption and some speed considerations, so there are tradeoffs to make in this area.
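To make the tradeoff concrete, here's a toy simulation (ours, not from the article) of how raising the detection threshold suppresses false triggers from thermal noise, modeled crudely as Gaussian:

```python
import numpy as np

rng = np.random.default_rng(42)

def false_trigger_rate(threshold_sigma, n_pixels=1_000_000):
    """Fraction of dark pixels that falsely register a photon because
    thermal noise (unit-variance Gaussian) exceeds the threshold."""
    noise = rng.normal(0.0, 1.0, n_pixels)
    return np.mean(noise > threshold_sigma)

for thr in (1.0, 2.0, 3.0, 4.0):
    print(f"threshold = {thr:.1f} sigma -> false triggers: {false_trigger_rate(thr):.4%}")
```

The flip side, of course, is that a higher threshold also discards genuinely dim photosite readings, which is why low-ISO shots in dim light need longer exposures.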

Another reason that thermal fluctuations cause noise in image sensors is that the pixels themselves are packed so close together that they influence their neighbors. The answer here seems obvious: increase the area of the sensor, make the pixels bigger, or both. That’s a fine solution if you have unlimited area, but in something like a cell phone it isn’t practical. This gets to the core of why most modern cell phones seem to be practically limited somewhere in the sixteen-to-twenty megapixel range. If the pixels are made too small in pursuit of a higher megapixel count, noise starts to ruin the images; if the pixels are made too big, the picture ends up with a low resolution.

There are some non-technological ways of increasing the megapixel count of an image as well. For example, a panoramic image will have a megapixel count much higher than that of the camera that took it, simply because each part of the panorama has the full megapixel count. It’s also possible to reduce noise in a single frame by using lenses that collect more light (lenses with a lower f-number), which lets the photographer use a lower ISO setting and reduce the camera’s sensitivity.

Gigapixels!

Of course, if you have unlimited area you can make image sensors of virtually any size. There are some extremely large, expensive cameras called gigapixel cameras that can take pictures of unimaginable detail. Their size and cost are limiting factors for consumer devices, though, so they are generally reserved for specialty purposes. The largest image sensor ever built has a surface of almost five square meters and is the size of a car. The camera will be put to use in 2019 in the Large Synoptic Survey Telescope in South America, where it will capture images of the night sky with its 8.4 meter primary mirror. If this were part of the megapixel race in consumer goods, it would certainly be the winner.

LSST image sensor, by Todd Mason, Mason Productions Inc. / LSST Corporation, https://www.lsst.org/sites/default/files/photogallery/Camera_CU-full.jpg, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=52230238

With all of this said, it becomes obvious that there are many more considerations in a digital camera than just the megapixel count. With so many other facets of a camera (physical sensor size, lenses, camera settings, post-processing capabilities, filters, and so on), the megapixel number was essentially an easy way for marketers to advertise the claimed superiority of their products until the practical limits of image sensors were reached. Beyond a certain point, more megapixels don’t automatically translate into a better picture. As already mentioned, megapixel count can still be important, but there are plenty of ways to make up for a lower count if you have to. For example, images with high dynamic range are becoming the norm even in cell phones, which also helps eliminate the need for a flash. Whatever you decide, though, if you want to start taking great pictures don’t worry about specs; just go out and take some photographs!

(Title image: VISTA gigapixel mosaic of the central parts of the Milky Way, produced by European Southern Observatory (ESO) and released under Creative Commons Attribution 4.0 International License. This is a scaled version of the original 108,500 x 81,500, 9-gigapixel image.)

Laser Sequencer Uses Arduino To Enable Super-Microscope!

[Philip]’s laser control Arduino shield.

[Philip Nicovich] has been building laser sequencers over at the University of New South Wales. His platform is used to sequence laser excitation on his fluorescence microscopy systems. In [Philip]’s case, these systems are used for super-resolution microscopy; that is, breaking the diffraction limit to allow imaging of structures only a few nanometers (millionths of a millimeter) in size.

Using an Arduino shield he designed in Eagle, [Philip] was able to build the system for less than half the cost of a commercial platform.

The control system is built around the Arduino shield shown to the right, which uses simple 74-series logic to send TTL control signals to the laser diodes used in his rig. The Arduino runs code that allows laser firing sequences to be programmed and executed.

[Philip] also provides scripts which show how the Arduino can be interfaced with the open source Micro-Manager control software.
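To give a flavor of that host-side control, here's a minimal sketch of driving the sequencer from Python over USB serial. The port name and command syntax are made up for illustration; they are not [Philip]'s actual NicoLase protocol:

```python
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"                 # hypothetical port name, adjust for your machine
SEQUENCE = "SEQ 405,488,561,647\n"    # hypothetical command: fire four diodes in order

with serial.Serial(PORT, 115200, timeout=2) as arduino:
    time.sleep(2)                     # give the Arduino time to reset after the port opens
    arduino.write(SEQUENCE.encode("ascii"))
    reply = arduino.readline().decode("ascii").strip()
    print("Sequencer replied:", reply)
```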


As well as the schematics, [Philip] has provided STEP files and drawings for the enclosure and mounts used in the system, plus a detailed BOM.

Perhaps more useful than all of this is the comprehensive write-up he provides, which describes the motivation behind decisions such as choosing aluminum over steel for its better heat transfer, and skipping thermal paste because of out-gassing.

While I can almost hear the cries of “not a hack”, the growing use of open source platforms and tools in academia fills us with joy. Thanks for the write-up, [Philip]; we look forward to hearing more about your laser systems in the future!

Zero Parts-Count Temperature Sensor

Quick: What’s the forward voltage drop on a conducting diode? If you answered something like 0.6 to 0.7 V, you get a passing grade, but you’re going to have to read on. If you answered V_F = (T - T_0) / k, where T_0 and k are device-specific constants to be determined experimentally, you get a gold Jolly Wrencher.

[Jakub] earned his Wrencher, and then some. Not only did he use the above equation to make a temperature sensor, he did so with a diode that you might have even forgotten you have on hand: the one inside the silicon of a MOSFET, the intrinsic body diode.

[Jakub]’s main project is an Arduino-controlled electronic load that he calls the MightyWatt, and a beefy power MOSFET is used as the variable resistance element. When it’s pulling 20 or 30 A, it gets hot. Exactly how hot is hard to measure without a temperature sensor, and the best possible temperature sensor would be one built into the MOSFET’s die itself.

There’s a bunch of detail in his write-up about how he switches the load in and out to measure the forward drop, and how he calibrates the whole thing. It’s technical, but give it a read; it’s good stuff, and a great trick to have up your sleeve.
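For the curious, the calibration math itself is only a couple of lines. Here's a sketch using the formula above with a two-point calibration; the voltages and temperatures are placeholders, not [Jakub]'s measured values:

```python
def calibrate(t1_c, vf1, t2_c, vf2):
    """Solve V_F = (T - T0) / k for the device constants T0 and k,
    given the forward drop measured at two known temperatures.
    For a silicon diode the drop falls roughly 2 mV per degree C,
    so k comes out negative and large (around -500 degC per volt)."""
    k = (t2_c - t1_c) / (vf2 - vf1)   # degrees C per volt
    t0 = t1_c - k * vf1
    return t0, k

def die_temperature(vf, t0, k):
    """Junction temperature from a measured body-diode forward drop."""
    return t0 + k * vf

# Placeholder calibration points: 0.620 V at 25 degC, 0.520 V at 75 degC.
t0, k = calibrate(25.0, 0.620, 75.0, 0.520)
print(f"T0 = {t0:.1f} degC, k = {k:.1f} degC/V")
print(f"V_F = 0.580 V -> {die_temperature(0.580, t0, k):.1f} degC")
```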

And if you’re in the mood for more stupid diode tricks, we recommend using them as solar cells or just stringing a bunch of them together to make a thermal camera.

Hackaday Reviews: Flir One Android

The Flir One thermal camera caused quite a stir when it was launched back in 2014. Both the Flir One and its prime competitor, Seek Thermal, represented the first “cheap” thermal cameras available to the public. At the heart of the Flir One was the Lepton module, which could be purchased directly from Flir Systems, but only in quantity. [Mike Harrison] jumped on board early, cutting into his Flir One and reverse engineering the Lepton module within, including the SPI protocol required to talk to it. He even managed to create the world’s smallest thermal imager using the TFT screen from an iPod Nano.

A few things have changed since then. You can buy Lepton modules in single quantity at DigiKey now. Flir also introduced a second generation of the Flir One. This device contains an updated version of the Lepton with a resolution of 160 x 120 pixels, double the linear resolution of the original module. There are two flavors: the iOS version with a Lightning port, and an Android version with a micro USB connector. I’m an Android user myself, so this review focuses on the Android edition.
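If you do pick up a bare Lepton from DigiKey, its video-over-SPI (VoSPI) interface is straightforward to poll from something like a Raspberry Pi. Here's a rough sketch for the original 80 x 60 module, assuming the spidev Python bindings; it's our own illustration, not [Mike]'s code, and a real application should also resynchronize on frame boundaries:

```python
import numpy as np
import spidev

WIDTH, HEIGHT = 80, 60        # original Lepton; the 160 x 120 part adds a segment layer
PACKET_BYTES = 164            # 2-byte ID + 2-byte CRC + 160 bytes of pixel data per line

spi = spidev.SpiDev()
spi.open(0, 0)                # SPI bus 0, chip select 0 -- adjust for your wiring
spi.mode = 0b11               # VoSPI uses SPI mode 3
spi.max_speed_hz = 10_000_000

frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint16)
while True:
    pkt = spi.readbytes(PACKET_BYTES)
    if (pkt[0] & 0x0F) == 0x0F:           # discard packet, camera not ready yet
        continue
    line = pkt[1]                          # packet number doubles as the image row
    if line < HEIGHT:
        frame[line] = np.frombuffer(bytearray(pkt[4:]), dtype=">u2")
    if line == HEIGHT - 1:                 # last row received, frame complete
        break
spi.close()

print("frame min/max counts:", frame.min(), frame.max())
```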

The module itself is smaller than I expected. It comes with a snap-on case and a lanyard. While you’ll look a bit like a dork wearing the lanyard, it does come in handy to keep the imager from getting lost or dropped. The Flir One has an internal battery, which of course needs to be topped off before it can be used. Mine charged up in about half an hour.

Continue reading “Hackaday Reviews: Flir One Android”

If We Were All Astronomers There’d Be No More War

We recently reported on the amateur scientific work of Forrest Mims. Forrest is somewhat unique in being an amateur scientist who has consistently published his work in leading scientific journals. One area of scientific investigation, however, has attracted amateur contributions of the highest quality almost since its inception: astronomy.

Will Hay – Amateur Astronomer

You’ve likely heard of amateur astronomers like David Levy, co-discoverer of Comet Shoemaker–Levy 9, and citizen science projects like Galaxy Zoo. But the history of amateur astronomy goes back far further than this; as far back as 1781, William Herschel discovered the planet Uranus while employed as a musician. Another entertainer of sorts, 1930s British comic actor Will Hay, also made significant contributions, discovering a “Great White Spot” on Saturn between film roles. Will was an avid amateur astronomer who regularly published his observations.

His belief that astronomy allows us to see humanity’s place in the universe in its true proportion led him to claim “If we were all astronomers there’d be no more war”.

While Will recorded his observations, hand drawn, in a log book, modern astronomers digitally image the night sky. Digital cameras are of course optimized around the human visual system (as we recently discussed), making them less than ideal for astrophotography. Hackers have therefore made a number of innovations, one of the more audacious being the removal of the Bayer filter:

Continue reading “If We Were All Astronomers There’d Be No More War”

$40 Lens Hack Gives Your FLIR Higher Clarity

[Josh Oster-Morris’s] FLIR camera can see a bit more clearly now that he’s hacked it to have its own makeshift “macro” mode. You may remember [Josh] from his power distribution Motobrain project. He’s still improving the Motobrain, and he wanted to better understand the thermal characteristics of its high current draws (upwards of 100 amps!).

After reading that the FLIR E4 could be hacked into a better version, [Josh] immediately purchased his own. The FLIR is, however, limited at close-range imaging because the resolution of its microbolometer is relatively low. Fortunately, he had stayed tuned in to [Mike’s] YouTube channel and saw his follow-up video a few days later on refocusing the FLIR camera with an external lens. [Josh] hit up Amazon for a gallium arsenide lens normally used with CO2 lasers, and found one for around $40. He then mounted the lens in a simple paper frame held together by tape and staples, and fitted it onto the FLIR.

After you’ve checked out [Josh’s] blog for more examples of how astoundingly clear the images become, check out [Mike’s] video detailing the hack below.

Continue reading “$40 Lens Hack Gives Your FLIR Higher Clarity”