Have you ever noticed that people in old photographs look a bit weird? Deep wrinkles, sunken cheeks, and exaggerated blemishes are commonplace in photos taken up to the early 20th century. Surely not everybody looked like this, right? Maybe it was an odd makeup trend — was it just a fashionable look back then?
Not quite — it turns out that the culprit here is the film itself. The earliest glass-plate emulsions used in photography were only sensitive to the highest-frequency light, from the blue range up into the ultraviolet. Combined with the fact that human skin is flushed with red blood, this posed a real problem: reddish skin tones reflect light the film simply couldn't see, so they rendered as dark. While some of the historical figures we see in old photos may have benefited from an improved skincare regimen, the primary source of their haunting visage was that the photographic techniques available at the time were simply incapable of capturing skin properly. This led to the sharp creases and dark lips we're so used to seeing.
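To get a feel for the effect, here's a toy simulation of how a blue-only emulsion renders a few colours compared to modern panchromatic film. The sensitivity weights and swatch colours are made-up illustrations, not measured film curves:

```python
import numpy as np

# Each film "sees" a colour as a weighted sum of its R, G, B components.
colours = {                      # (R, G, B), each 0..1
    "pale skin": (0.95, 0.75, 0.65),
    "red lips":  (0.75, 0.25, 0.30),
    "blue sky":  (0.50, 0.70, 0.95),
}
blue_sensitive = np.array([0.0, 0.1, 0.9])   # early emulsion: blind to red
panchromatic   = np.array([0.3, 0.4, 0.3])   # modern film: sees all three

for name, rgb in colours.items():
    old = np.dot(blue_sensitive, rgb)        # brightness on old glass plates
    new = np.dot(panchromatic, rgb)          # brightness on modern film
    print(f"{name:10s} old film: {old:.2f}   modern film: {new:.2f}")
```

Run it and the reds come out far darker on the old emulsion than on the modern one, which is exactly the inky-lipped look in those antique portraits.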
Of course, primitive film isn’t the only thing separating antique photos from the 42 megapixel behemoths that your camera can take nowadays. Film processing steps had the potential to introduce dust and other blemishes to the image, and over time the prints can fade and age in a variety of ways that depend upon the chemicals they were processed in. When rolled together, all of these factors make it difficult to paint an accurate portrait of some of history’s famous faces. Before you start to worry that you’ll never know just what Abraham Lincoln looked like, you might consider taking a stab at Time-Travel Rephotography.
Amazingly, Time-Travel Rephotography is a technique that actually lives up to how cool its name is. It uses a neural network (specifically, the StyleGAN2 framework) to take an old photo and project it into the space of high-res modern photos the network was trained on. This allows it to perform colorization, skin correction, upscaling, and various noise reduction and filtering operations in a single step which outputs remarkable results. Make sure you check out the project’s website to see some of the outputs at full-resolution.
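We obviously can't run StyleGAN2 in a snippet, but the core "projection" step boils down to optimizing a latent code until the generator reproduces the target photo. Here's a minimal sketch of that idea with a stand-in linear "generator" in place of the real network (the actual project optimizes through pretrained StyleGAN2 weights and adds several loss terms):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a pretrained generator: maps a latent vector to an "image".
G = rng.normal(size=(64, 16))       # 16-D latent -> 64-pixel "image"

def generate(z):
    return G @ z

# The "antique photo" we want to project into the generator's space.
z_true = rng.normal(size=16)
target = generate(z_true)

# Projection = gradient descent on the latent code to match the target.
z = np.zeros(16)
lr = 0.005
for _ in range(2000):
    residual = generate(z) - target  # reconstruction error
    grad = G.T @ residual            # gradient of 0.5 * ||G z - target||^2
    z -= lr * grad

print(np.allclose(generate(z), target, atol=1e-3))  # → True
```

Once the latent code is found, the real system renders it through the modern generator, which is where the colorization, skin correction, and upscaling all come from in a single pass.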
We’ve seen AI upscaling before, but this project takes it to the next level by completely restoring antique photographs. We’re left wondering what techniques will be available 100 years from now to restore JPEGs stored way back in 2021, bringing them up to “modern” viewing standards.
Thanks to [Gus] for the tip!
Continue reading “Imaging The Past With Time-Travel Rephotography”
Join us on Wednesday, January 20th at noon Pacific for the Movie Magic Hack Chat with Alan McFarland!
If they were magically transported ahead in time, the moviegoers of the past would likely not know what to make of our modern CGI-driven epics, with physically impossible feats performed in landscapes that never existed. But for as computationally complex as movies have become, it’s the rare film that doesn’t still need at least some old-school movie magic, like hand props, physical models, and other practical effects.
To make their vision come to life, especially in science fiction films, filmmakers turn to artists who specialize in practical effects. We’ve all seen their work, which in many cases involves turning ordinary household objects into yet-to-be-invented technology, or creating scale models of spaceships and alien landscapes. But to really sell these effects, adding a dash of electronics can really make the difference.
Enter Alan McFarland, an electronics designer and engineer for the film industry. With a background in cinematography, electronics, and embedded systems, he has been able to produce effects in movies we’ve all seen. He designed electroluminescent wearables for Tron: Legacy, built the lighting system for the miniature Fhloston Paradise in The Fifth Element, and worked on the Borg costumes for Star Trek: First Contact. He has tons of experience making the imaginary look real, and he’ll join us on the Hack Chat to discuss the tricks he keeps in his practical effects toolkit to make movie magic.
Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, January 20 at 12:00 PM Pacific time. If time zones have you tied up, we have a handy time zone converter.
Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.
Continue reading “Movie Magic Hack Chat”
Like many of us, [Emily] found herself on COVID-19 lockdown over the summer. To make the most of her time in isolation, she put together an optical audio decoder for old 16 mm film, built using modern components and a bit of 3D printing.
It all started with a broken 16 mm projector that [Emily] got from a friend. After repairing and testing the projector with a roll of film bought at a flea market, she discovered that the film contained an audio track that her projector couldn’t play. The audio track is encoded as a translucent strip with varying width, and when a mask with a narrow slit is placed over the top it modulates the amount of light that can pass through to a light sensor connected to speakers via an amplifier.
[Emily] used a pair of razor blades mounted to a 3D printed bracket to create the mask, and a TI OPT101 light sensor together with a light source to decode the optical signal. She tried to use a photoresistor and a discrete photodiode, but neither had the required sensitivity. She built a frame with adjustable positions for an idler pulley and the optical reader unit, an electronics box on one end for the electronic components, and another pulley attached to a stepper motor to cycle a short loop of the film.
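The decoding scheme itself is simple enough to simulate: a variable-area track encodes the audio waveform as the width of a clear strip, so the light reaching the sensor through the slit is proportional to that width. This sketch encodes a test tone as track widths and reads it back; the slit-readings-per-frame figure is an assumption for illustration, not from [Emily]'s build:

```python
import numpy as np

# 24 fps film, with an assumed ~96 slit readings per frame of track.
sample_rate = 24 * 96
t = np.arange(sample_rate) / sample_rate       # one second of track
audio = np.sin(2 * np.pi * 440 * t)            # original signal, -1..1

# Encode: waveform becomes the clear-strip width (0..1 of track height).
track_width = (audio + 1) / 2

# The reader: light through the slit is proportional to the clear width,
# so an ideal photodiode output just needs its DC offset and scale undone.
sensor = track_width                            # ideal sensor, unity gain
decoded = sensor * 2 - 1

print(np.max(np.abs(decoded - audio)) < 1e-9)  # → True
```

In the real machine the "decode" step is done entirely in analog, of course — the amplifier's AC coupling strips the DC offset that the clear strip's average width represents.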
Most of the projects we see involving film these days are for creating digital copies. You can digitize your old 35 mm photo film using a Raspberry Pi, some Lego pieces, and a DSLR camera, or do the same for 8 mm film with a 3D printed rig. Continue reading “Listening To Long Forgotten Voices: An Optical Audio Decoder For 16 Mm Film”
How much would you enjoy a movie that took months to finish? We suppose it would very much depend on the film; the current batch of films from the Star Wars franchise are quite long enough as they are, thanks very much. But a film like Casablanca or 2001: A Space Odyssey might be a very different experience when played on this ultra-slow-motion e-paper movie player.
The idea of leaving a single frame of a movie up for hours rather than milliseconds has captivated [Tom Whitwell] since he saw [Bryan Boyer]'s take on the concept. The hardware [Tom] used is similar: a Raspberry Pi, an SD card hat with a 64 GB card for the movies, and a Waveshare e-paper display, all of which fits nicely in an IKEA picture frame.
[Tom]’s software is a bit different, though; a Python program uses FFmpeg to fetch and dither frames from a movie at a configurable rate, to customize the viewing experience a little more than the original. Showing one frame every two minutes and then skipping four frames, it has taken him more than two months to watch Psycho. He reports that the shower scene was over in a day and a half — almost as much time as it took to film — while the scene showing [Marion Crane] driving through the desert took weeks to finish. We always wondered why [Hitch] spent so much time on that scene.
With the proper films loaded, we can see this being an interesting way to really study the structure and flow of a good film. It’s also a good way to cut your teeth on e-paper displays, which we’ve seen pop up in everything from weather stations to Linux terminals.
We take orbital imagery for granted these days, but there was a time that it was high technology and highly secretive. [Scott Manley] has a good overview of the CIA’s Corona spy satellites, along with declassified images from the early days of the program.
It seems strange today, but the spy images needed high resolution and the only practical technology at the time was film. The satellite held a whopping 3,000 feet of film and, once shot, a capsule or bucket would return to Earth for retrieval and development. They didn’t make it to land — or at least they weren’t supposed to. The CIA didn’t want opponents sweeping up the film so an airplane was supposed to snag the bucket as it descended on a parachute, a topic covered in [Tom Nardi’s] article about the history of catching stuff as it falls from space.
The early cameras could see detail down to about 40 feet. By the end of the program in the 1970s, improved cameras could see down to 3 feet or less. Later satellites had a 3D-capable camera and multiple return buckets. The satellites were — officially — a program to expose biological samples to the space environment and return them for analysis. The Discoverer program was pure cover and the whole thing was declassified in 1992.
Of course, film from airplanes also had a role. Some spy satellites tried to scan film and send the data back, but that saw more use on lunar missions where returning a capsule to Earth was a lot more difficult.
Continue reading “The CIA’s Corona Project Was About Satellites, Not A Virus”
If you spent your youth watching James Bond or similar movies on rainy Saturday afternoons, you may be familiar with the microdot as a top-secret piece of spy equipment. It's usually revealed attached to some seemingly innocuous possession of one of the bad guy's henchmen, and when blown up on screen it delivers the cryptic yet vital clue to the location of the Evil Lair. Not something you'd give much thought in 2020, you might think; but that's reckoning without [Sister HxA], who has worked out how to make them herself and detailed the process in a Twitter thread.
A microdot is a tiny scrap of photographic film, containing the image of some secret document or other, the idea being that it is small enough to conceal on something else. The example she gives is hiding it underneath a postage stamp. Because of their origins in clandestine work there is frustratingly little info on how to produce them, but she found a set of British instructions. Photographing a sheet so that its image occupies only a small portion of the negative, she makes a postage-stamp-sized copy; carefully photographing that in turn, she produces another only a few millimetres across. The smaller one isn't very legible, but it's still a fascinating process.
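The two-pass process is really just chained optical reduction, and the overall shrink factor multiplies. A quick sketch with assumed figures (the stamp size and second-stage ratio here are illustrative guesses, not [Sister HxA]'s actual numbers):

```python
def reduced_size(original_mm, *ratios):
    """Final size after a chain of optical reductions."""
    size = original_mm
    for r in ratios:
        size *= r
    return size

sheet_height = 297       # an A4 sheet, in mm
stage1 = 22 / 297        # first pass: down to a ~22 mm "postage stamp"
stage2 = 1 / 10          # second pass: a further 10:1 reduction (assumed)

print(round(reduced_size(sheet_height, stage1, stage2), 1))   # → 2.2
```

At a couple of millimetres the total reduction is over 100:1, which is also why the legibility suffers — film grain and lens resolution don't scale down with the image.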
While we’re shopping at Q branch, how about an air-gun pen worthy of James Bond?
For the vast majority of readers, the act of taking a photograph will mean reaching for a mobile phone or, for a subset of you, picking up a digital camera. A very small number of you will still use chemical film for its versatility and resolution, and we're guessing that more would join those ranks if some of the cost barriers to doing so could be reduced.
It would be near-impossible to reduce the cost of a chemical photograph to the infinitely repeatable click of a digital camera shutter, but at least if the cost of a darkroom is intimidating then [Sroyon Mukherjee] has an interesting post over at 35mmc about how a darkroom for black-and-white printing from negatives can be equipped for less than £100 ($123). It’s a fascinating read even if your photography remains firmly in the digital, because along the way it explains some of the mysteries of the process. Few people had this type of equipment at home even in the days when most of us took our films to the drugstore, so as time passes this knowledge is concentrated among an ever narrower group.
The guide is full of useful hacks. Finding a second-hand enlarger takes an element of patience, but once it has been secured there are a variety of other essential items. The red safe light can be as simple as a mobile phone flashlight with a red filter, but we learn the trick of exposing a sheet of photographic paper with a coin laid on it to check that no white light is sneaking in. One of the main points of the piece is that there is no need for a special room to make a darkroom, and we take a tour of a few photographers’ set-ups in hallways, bathrooms, and basements.
So if you spot an unloved enlarger just waiting for a hacker to pass by, this might inspire you to do something with it. He doesn’t cover the development process, but if you throw caution to the winds you could always try coffee and vitamin C.
[via Hacker News]