In the grand scheme of things, it wasn’t all that long ago that the entire body of knowledge of our solar system was built solely on Earth-based observations. Turning first their naked eyes to the heavens, and then a succession of increasingly complex and sensitive optical and radio telescopes gathering light from all across the spectrum, our astronomically curious forebears did a commendable job working out the broad strokes of what’s going on in the neighborhood.
But there’s only so much information that can be gathered by instruments operating at the bottom of a roiling ocean of air, so when sending instruments to our planetary neighbors became possible some 60 years ago, scientists started planning how to accomplish it. What resulted was the Mariner program, a series of interplanetary probes launched between 1962 and 1973 that performed flyby missions of the inner planets.
The list of accomplishments of the Mariner program is long indeed, and the number of firsts achieved by its ten spacecraft is impressive. But it is Mariner 4, the first flyby mission of Mars, which set the stage for a lot of the science being done on and around Mars today, and the first mission where NASA wisely took a “pics or it didn’t happen” approach to planetary science. It was the first time a TV camera had traveled to another world, and it was anything but a sure bet that it would pay dividends.
What happens when you mix over 23,000 coffee stirrers and a Raspberry Pi camera together? Probably nothing except for a mess, unless you very specifically pack the plastic straws and orient the camera just right. In that case, you get this very cool lensless digital straw camera that takes artfully ghostly images.
Actually, lensless is a bit of a reach for [Adrian Hanft]’s creation. While the camera he’s using to grab the image has a lens, the objective, for lack of a better term, is just a tightly packed bundle of straws. We’ve seen this approach to photography before, but there the camera used film placed at the back of the straw bundles to capture the pixelated image.
Here, a ground glass screen stands in for the film, and a long lightproof box behind it provides a place to mount a camera to capture the images. Cleverly, [Adrian] built the camera mount from Lego, allowing cameras and lenses to be quickly swapped out. A Nintendo gamepad talks to custom software running on a Raspberry Pi, letting the photographer control exposure and scroll through pictures using a smartphone as a display. There’s a short build video below, for those who can’t get enough of straw-packing techniques.
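The details of [Adrian]’s control software aren’t published here, but the general idea — a USB gamepad driving camera actions on a Pi — might be sketched like this. The `evdev` library, the button codes, and the action names are all assumptions for illustration, not his actual code:

```python
# Hypothetical sketch: reading a USB gamepad on a Raspberry Pi and
# mapping button presses to camera actions. The evdev library and the
# button-code mapping below are assumptions, not [Adrian]'s software.

# Linux input button codes mapped to camera actions (illustrative).
BUTTON_ACTIONS = {
    304: "capture",        # BTN_SOUTH: take a photo
    305: "cancel",         # BTN_EAST: discard preview
    310: "exposure_down",  # BTN_TL: shorten exposure
    311: "exposure_up",    # BTN_TR: lengthen exposure
}

def action_for(code):
    """Return the camera action for a gamepad button code, or None."""
    return BUTTON_ACTIONS.get(code)

def run(device_path="/dev/input/event0"):
    """Event loop: dispatch gamepad key-down events to camera actions."""
    from evdev import InputDevice, ecodes  # pip install evdev
    pad = InputDevice(device_path)
    for event in pad.read_loop():
        if event.type == ecodes.EV_KEY and event.value == 1:  # key down
            action = action_for(event.code)
            if action:
                print(action)  # replace with real camera calls
```

Keeping the code-to-action mapping in a plain dictionary makes it trivial to remap buttons for whatever pad happens to be lying around.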
As with the film version of this camera, we just love the look of the photographs that come from this — the texture of the straw honeycomb and the defocused subject make for a striking effect.
Have you ever noticed that people in old photographs look a bit weird? Deep wrinkles, sunken cheeks, and exaggerated blemishes are commonplace in photos taken up to the early 20th century. Surely not everybody looked like this, right? Maybe it was an odd makeup trend — was it just a fashionable look back then?
Not quite — it turns out that the culprit here is the film itself. The earliest glass-plate emulsions used in photography were only sensitive to the highest-frequency light, that which fell in the blue to ultraviolet range. Perhaps unsurprisingly, when combined with the fact that humans have red blood, this posed a real problem. While some of the historical figures we see in old photos may have benefited from an improved skincare regimen, the primary source of their haunting visages was that the photographic techniques available at the time were simply incapable of capturing skin properly. This led to the sharp creases and dark lips we’re so used to seeing.
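The effect is easy to see with a little arithmetic. A modern panchromatic response weights red, green, and blue roughly by the Rec. 601 luma coefficients, while a blue-only emulsion records just the blue channel — and reddish skin has very little blue in it. The RGB values below are illustrative, not measured:

```python
# Toy illustration of why early blue-sensitive emulsions darkened skin.
# Modern sensors/film weight R, G, B roughly by the Rec. 601 luma
# coefficients; an early emulsion responds almost entirely to blue.
# The skin and lip RGB values are illustrative guesses.

def panchromatic(r, g, b):
    """Approximate modern brightness response (Rec. 601 luma)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def blue_sensitive(r, g, b):
    """Approximate an early emulsion that only records blue light."""
    return b

skin = (220, 170, 140)  # a light skin tone: lots of red, little blue
lips = (180, 90, 90)    # reddish lips

for name, rgb in [("skin", skin), ("lips", lips)]:
    print(f"{name}: modern {panchromatic(*rgb):.0f}, early {blue_sensitive(*rgb):.0f}")
```

Skin that a modern sensor renders bright comes out markedly darker on a blue-only emulsion, which is exactly the sunken-cheek, dark-lipped look of those old portraits.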
Of course, primitive film isn’t the only thing separating antique photos from the 42-megapixel behemoths your camera can capture nowadays. Film processing steps had the potential to introduce dust and other blemishes to the image, and over time the prints can fade and age in a variety of ways that depend upon the chemicals they were processed in. When rolled together, all of these factors make it difficult to paint an accurate portrait of some of history’s famous faces. Before you start to worry that you’ll never know just what Abraham Lincoln looked like, you might consider taking a stab at Time-Travel Rephotography.
Amazingly, Time-Travel Rephotography is a technique that actually lives up to how cool its name is. It uses a neural network (specifically, the StyleGAN2 framework) to take an old photo and project it into the space of high-res modern photos the network was trained on. This allows it to perform colorization, skin correction, upscaling, and various noise reduction and filtering operations in a single step that outputs remarkable results. Make sure you check out the project’s website to see some of the outputs at full resolution.
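The core of that “projection” step is an optimization: find a latent vector whose generated image best matches the input photo. The real system optimizes through StyleGAN2 on a GPU; as a self-contained sketch of the idea, here the “generator” is just a tiny fixed linear map, which is nothing like the real model but shows the same descend-on-the-latent loop:

```python
# Toy sketch of GAN "projection": find a latent z whose generated
# output best matches a target image, via gradient descent on the
# reconstruction error. The real work optimizes through StyleGAN2;
# this linear stand-in generator just keeps the example runnable.

W = [[1.0, 2.0],       # toy generator weights: output = W @ z
     [-1.0, 0.5],
     [0.3, -2.0],
     [2.0, 1.0]]

def generate(z):
    """Toy generator: map a 2-D latent to a 4-'pixel' image."""
    return [row[0] * z[0] + row[1] * z[1] for row in W]

def loss(z, target):
    """Squared reconstruction error between generated and target."""
    return sum((g - t) ** 2 for g, t in zip(generate(z), target))

target = generate([0.7, -1.3])  # the "old photo" we want to project

z = [0.0, 0.0]                  # start from an arbitrary latent
lr = 0.02
for _ in range(1000):
    # Analytic gradient of the squared error with respect to z.
    residual = [g - t for g, t in zip(generate(z), target)]
    grad = [2 * sum(r * row[i] for r, row in zip(residual, W)) for i in (0, 1)]
    z = [zi - lr * gi for zi, gi in zip(z, grad)]
```

Once the recovered latent reproduces the photo, edits made in latent space — color, skin texture, resolution — come along for free, which is what makes the one-step restoration possible.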
We’ve seen AI upscaling before, but this project takes it to the next level by completely restoring antique photographs. We’re left wondering what techniques will be available 100 years from now to restore JPEGs stored way back in 2021, bringing them up to “modern” viewing standards.
Starting projects is easy. It’s the finishing part that many of us have trouble with. We can hardly imagine completing a project after more than a decade, but seeing the breathtaking results of [J-P Metsavainio]’s gigapixel composite image of our galaxy might just make us reconsider. The photograph, which we highly suggest you go check out in its full glory, has been in progress since 2009, features 1250 total hours of exposure time, and spans across 125 degrees of sky. It is simply spectacular.
Of course, it wasn’t an absolutely continuous effort to make this one image over those twelve years. Part of the reason for the extended time span is that many frames of the mosaic were shot, processed, and released as their own individual pieces, each of the many astronomical features impressive in its own right. But, over the years, he’s filled in the gaps between and has been able to release a more and more complete picture of our galactic home.
A project this long, somewhat predictably, eventually outlives the technology used to create it. Up until 2014, [Metsavainio]’s setup included a Meade 12-inch telescope and some modified Canon optics. Since then, he’s used a dedicated equatorial mount, astrocamera, and a Tokina lens (again, modified) with an 11-inch Celestron for longer focal lengths. He processes the frames in Photoshop, accounting for small exposure and color differences and aligning the images based on background stars. He’s had plenty of time to get his process down, though, so the necessary tweaking is relatively minor.
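Star-based alignment boils down to locating the same stars in two overlapping frames and shifting one until they coincide. [Metsavainio] does this across full mosaics in Photoshop; a single-star, brightest-pixel version of the idea — purely an illustration, not his workflow — looks like this:

```python
# Minimal sketch of star alignment: find the same bright star in two
# frames and compute the shift that makes them coincide. Real mosaic
# alignment matches many background stars; this one-star version is
# just an illustration of the principle.

def brightest(frame):
    """Return (row, col) of the brightest pixel in a 2-D list."""
    return max(
        ((r, c) for r in range(len(frame)) for c in range(len(frame[0]))),
        key=lambda rc: frame[rc[0]][rc[1]],
    )

def offset(frame_a, frame_b):
    """Translation (dr, dc) that maps frame_b's star onto frame_a's."""
    (ra, ca), (rb, cb) = brightest(frame_a), brightest(frame_b)
    return ra - rb, ca - cb

# Two 5x5 "exposures" of the same star, shifted by one pixel each way.
a = [[0] * 5 for _ in range(5)]
b = [[0] * 5 for _ in range(5)]
a[2][2] = 255
b[3][1] = 255

print(offset(a, b))  # → (-1, 1): shift b up one row, right one column
```

With the offsets known, the remaining work is the exposure and color matching between frames that [Metsavainio] handles by hand.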
Whether you’re live streaming builds or just want to take your project photography to the next level, you can’t beat an overhead camera setup. Unfortunately, they tend to be cumbersome and more often than not quite pricey. Looking for an affordable solution that could easily be moved out of the way when not in use, [Jay Doscher] had the clever idea of adapting a common VESA monitor arm to give his camera a bird’s eye view of the action.
If you think about it, one of these monitor arms is a nearly perfect base for a camera rig. They’re easily mounted to a desk or work bench, can be quickly repositioned by design, and perhaps best of all, you don’t have to spend a lot of money to get a decent one. A camera is also a far lighter and less awkward payload than the arm was designed to hold, so you don’t have to worry about it potentially dropping your expensive gear. Or cheap webcam, as the case may be.
All [Jay] had to do was come up with a way to securely mount his Sony A7R3 on the end of one. While there are certainly a few ways you could solve this particular problem, he went the extruded plastic route and 3D printed a beefy adapter plate with the standard VESA bolt pattern. His SmallRig camera cage attaches to the plate, and thanks to a pair of press-fit bubble levels from McMaster-Carr, he’s able to get everything lined up properly over the bench.
It’s a job that’s ideally suited for the average microcontroller. In this case, [Alex] chose the venerable Arduino Uno. Paired with a bunch of buttons and a 16 x 2 character LCD, it has a simple-to-navigate interface for dialling in a shot. The trick to splash and droplet photography is to first open a valve to release a droplet, and then fire a flash a set time later to capture the droplet in flight or the splash after it’s hit the surface of the liquid. [Alex]’s design uses a MOSFET to trigger the water valve, and optoisolators to safely trigger the flash and camera.
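At its heart the controller is just a timed sequence: open the valve, close it, wait, fire the flash. [Alex]’s build runs this on the Arduino; the sketch below expresses the same sequence in Python with injected callables standing in for the MOSFET and optoisolator outputs, so the hardware side is entirely stubbed out:

```python
import time

# The controller's core is a timed sequence: release a drop, then fire
# the flash after a programmable delay. [Alex] runs this on an Arduino
# Uno; here the valve and flash outputs are injected callables, so the
# MOSFET/optoisolator hardware is stubbed out. Delay values are
# illustrative starting points, not [Alex]'s settings.

def capture_droplet(open_valve, close_valve, fire_flash,
                    drop_ms=20, flash_delay_ms=150):
    """Release one droplet, then trigger the flash flash_delay_ms later."""
    open_valve()
    time.sleep(drop_ms / 1000)         # valve-open time sets drop size
    close_valve()
    time.sleep(flash_delay_ms / 1000)  # tune to freeze drop or splash
    fire_flash()
```

In practice the camera shutter is opened beforehand in a darkened room, so it’s the brief flash, not the shutter, that actually freezes the motion — which is why the flash delay is the number worth sweeping shot to shot.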
In today’s fast-paced world of social media, if you want your photos to grab attention, you’ve got to have an edge. Whether it’s a deft touch in Photoshop or an amazing lens, it’s important to stand apart. Another great way is to experiment with lighting and color. To do just that, [Andrei] built a pocket RGB photo light for the home studio.
This is a project that any experienced maker should be able to whip up in a weekend. Not that there’s anything wrong with that, of course. The basic enclosure is 3D printed and readily reproducible on any FDM printer. Lighting is provided via the venerable WS2812B LED, 68 of them, to be exact. Finally, there’s an ESP8266 running WLED, firmware with a built-in web server dedicated to controlling LED strips. This makes it easy to tweak the LEDs with your smartphone.
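Beyond the phone UI, WLED also exposes a JSON API (an HTTP POST to `/json/state` on the ESP8266), so the panel can be scripted — handy for stepping through colors between shots. Scripting it this way is our suggestion rather than part of [Andrei]’s build; the helper below just constructs the payload:

```python
import json

# WLED accepts an HTTP POST of JSON to /json/state to set the light's
# state. This helper only builds the payload; actually scripting the
# panel (rather than using the phone UI) is an assumption on our part,
# not something shown in [Andrei]'s build.

def wled_state(r, g, b, brightness=128, on=True):
    """Build a WLED /json/state payload setting one solid RGB color."""
    return {
        "on": on,
        "bri": brightness,              # 0-255 master brightness
        "seg": [{"col": [[r, g, b]]}],  # first segment, primary color
    }

payload = json.dumps(wled_state(255, 120, 0, brightness=200))
# POST this to http://<wled-ip>/json/state, e.g. with urllib or curl.
```

A warm orange at one brightness, a cool blue at another — a short script can cycle a whole lighting test series without touching the phone.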
Thanks to the WS2812B LEDs, a full range of RGB colors is available for [Andrei] to experiment with. He’s done a great job showing off the light with a few choice cat pics that serve to show its capabilities. While we wouldn’t expect to use such a device for clean white lighting in a serious photographic sense, it’s a perfect tool for art photography.