Pi Cam Replaces Pinhole And Film For Digital Solargraphy

Solargraph from a one-year exposure on film. Elekes Andor / CC BY-SA

Have you ever heard of solargraphy? The name tells you much of what you need to know, but the images created with a homemade pinhole camera and a piece of photographic film can be visually arresting, showing as they do the cumulative tracks of the sun’s daily journey across the sky over many months. But what if you don’t want to use film? Is solargraphy out of reach for the digital photographers of the world?

Not at all, thanks to this digital solargraphy setup. [volzo] searched for a way to make a digital camera perform like a film-based solargraphic camera, first thinking to take a series of images during the day and average them together. He found that this just averaged out the sun from the final image. His solution was to take a pair of photos at each timepoint — one correctly exposed to capture the scene, and one stopped way down to just capture the position of the sun as a pinprick of light. All the foreground images are averaged, while the stopped-down sun images are overlaid upon each other, producing the track of the sun across the sky. Add the two resulting images and you’ve got a solargraph.
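If you want to play with the blending yourself, the overlay step is presumably a per-pixel maximum (a “lighten” blend) while the foreground is a straight average. Here’s a minimal numpy sketch of that pipeline; the file-naming scheme is our own invention, not [volzo]’s:

```python
import glob
import numpy as np
import cv2  # OpenCV, used here just for image I/O

# Hypothetical layout: scene_0001.jpg (normal exposure) paired
# with sun_0001.jpg (stopped way down) for each timepoint.
scene_files = sorted(glob.glob("scene_*.jpg"))
sun_files = sorted(glob.glob("sun_*.jpg"))

# Average the correctly exposed frames to build the foreground.
acc = np.zeros_like(cv2.imread(scene_files[0]), dtype=np.float64)
for f in scene_files:
    acc += cv2.imread(f).astype(np.float64)
foreground = acc / len(scene_files)

# "Lighten" blend the stopped-down frames: a per-pixel maximum
# keeps only the brightest value ever seen, i.e. the sun's track.
track = cv2.imread(sun_files[0])
for f in sun_files[1:]:
    track = np.maximum(track, cv2.imread(f))

# Add the sun track on top of the averaged scene.
solargraph = np.clip(foreground + track.astype(np.float64), 0, 255)
cv2.imwrite("solargraph.jpg", solargraph.astype(np.uint8))
```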

To automate the process, [volzo] used a Raspberry Pi and a Pi-Cam fitted in a weatherproof 3D-printed box. A custom HAT powers up the Pi every few minutes; the Pi boots and takes the two pictures. Sadly, the batteries only last for a couple of days, so those long six-month exposures aren’t possible yet. But [volzo] has made all the sources available, so feel free to build on his work. If you prefer to use a DSLR for the job, this Bluetooth intervalometer might help.
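For reference, each wakeup capture could be as simple as the following picamera sketch. The shutter value is an illustrative guess, not [volzo]’s actual setting; his sources have the real numbers:

```python
import time
from picamera import PiCamera

camera = PiCamera()
camera.iso = 100
time.sleep(2)  # give the auto-exposure a moment to settle

# Shot 1: normally exposed frame for the averaged foreground.
camera.capture("scene.jpg")

# Shot 2: stop way down so only the sun registers as a pinprick.
camera.shutter_speed = 100    # 100 microseconds -- an illustrative guess
camera.exposure_mode = "off"  # lock gains at the forced settings
camera.capture("sun.jpg")

camera.close()
```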

Machine Vision Keeps Track Of Grubby Hands

Can you remember everything you’ve touched in a given day? If you’re being honest, the answer is, “Probably not.” We humans are a tactile species, with an outsized proportion of both our motor and sensory nerves devoted to our hands. We interact with the world through our hands, and unfortunately that may mean inadvertently spreading disease.

[Nick Bild] has a potential solution: a machine-vision system called Deep Clean, which monitors a scene and records anything in it that has been touched. [Nick]’s system uses a Jetson Xavier and a stereo camera to detect depth in a scene; he built his camera from a pair of Raspberry Pi cams and a Pi 3B+, but other depth cameras like a Kinect could probably do the job. The idea is to watch the scene for human hands — OpenPose is the tool he chose for that job — and correlate their depth in the scene with the depth of objects. Touch a doorknob or a light switch, and a marker is left on the scene. A cleaning crew could then look at the scene to determine which areas need extra attention. We can think of plenty of applications that extend beyond the current crisis, as the ability to map areas that have been touched seems to be generally useful.
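The touch test itself boils down to comparing two depth values. Purely as an illustration (the function name, tolerance, and marker size below are ours, not [Nick]’s), the core check might look something like this:

```python
import numpy as np

TOUCH_TOLERANCE_MM = 30.0  # hand-to-surface gap that counts as a touch (our guess)
MARK_RADIUS_PX = 10        # size of the marker left on the touch map (our guess)

def update_touch_map(touch_map, keypoints, live_depth, scene_depth):
    """Accumulate touched spots into touch_map (HxW boolean array).

    keypoints:   (x, y) pixel positions of hand joints, e.g. from OpenPose
    live_depth:  depth map of the current frame, in mm
    scene_depth: baseline depth map of the empty scene, in mm
    """
    h, w = touch_map.shape
    for x, y in keypoints:
        x, y = int(round(x)), int(round(y))
        # A touch: the hand sits at the same depth as the surface behind it.
        if abs(live_depth[y, x] - scene_depth[y, x]) < TOUCH_TOLERANCE_MM:
            yy, xx = np.ogrid[:h, :w]
            mask = (xx - x) ** 2 + (yy - y) ** 2 <= MARK_RADIUS_PX ** 2
            touch_map |= mask
    return touch_map
```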

[Nick] has been getting some mileage out of that Xavier lately — he’s used it to build an AI umpire and shades that help you find lost stuff. Who knows what else he’ll find to do with them during this time of confinement?

3D-Printed Film Scanner Brings Family Memories Back To Life

There is a treasure trove of history locked away in closets and attics, where old shoeboxes hold reels of movie film shot by amateur cinematographers. They captured children’s first steps, family vacations, and parties where [Uncle Bill] was getting up to his usual antics. Little of what was captured on thousands of miles of 8-mm and Super 8 film is consequential, but giving a family the means to see long lost loved ones again can be a powerful thing indeed.

That was the goal of [Anton Gutscher]’s automated 8-mm film scanner. Yes, commercial services exist that will digitize movies, slides, and snapshots, but where’s the challenge in that? And a challenge is what it ended up being. Aside from designing and printing something like 27 custom parts, [Anton] also had a custom PCB fabricated for the control electronics. Film handling is done with a stepper motor that advances the film one frame at a time for scanning and cropping. An LCD lets the archivist position the cropping window manually, and individual images are strung together with ffmpeg running on the embedded Raspberry Pi. There’s a brief clip of film from a 1976 trip to Singapore in the video below; we find the quality of the digitized film remarkably good.
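The ffmpeg step is the straightforward part: once the cropped frames land on disk as numbered stills, one command turns them into a video. Here’s a sketch of how that might be wired up; the file names and playback rate are our assumptions, not details from [Anton]’s build:

```python
import subprocess

# Stitch numbered frame scans into an H.264 video. Super 8 was
# typically shot at 18 fps (Standard 8 closer to 16), so 18 is
# a guess at a faithful playback speed.
subprocess.run([
    "ffmpeg",
    "-framerate", "18",             # input frame rate
    "-i", "frames/frame_%05d.png",  # numbered stills from the scanner
    "-c:v", "libx264",              # widely supported codec
    "-pix_fmt", "yuv420p",          # pixel format most players expect
    "movie.mp4",
], check=True)
```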

Hats off to [Anton] for stepping up as the family historian with this build. We’ve seen ad hoc 8-mm digitizers before, but few that look this polished. We’ve also featured other archival attempts, like this high-speed slide scanner.

Raspberry Pi Tracks Humans, Blasts Them With Heat Rays

Given how long humans have been warming themselves up, you’d think we would have worked out all the kinks by now. But even with central heating, and indeed sometimes because of it, some places we frequent just aren’t that cozy. In such cases, it often pays to heat the person, not the room, but that can be awkward, to say the least.

Hacking polymath [Matthias Wandel] worked out a solution to his cold shop with this target-tracking infrared heater. The heater is one of those radiant deals with the parabolic dish, and as anyone who’s walked past one on demo in Costco knows, they throw a lot of heat in a very narrow beam. [Matthias] leveraged a previous project that he whipped up for offline surveillance as the core of the project. Running on a Raspberry Pi with a camera, the custom software analyzes images and locates motion across the width of a frame. That drives a stepper that swivels a platform for the heater. The video below shows the build and the successful tests; however, fans of [Matthias] should prepare themselves for a shock as he very nearly purchases a lazy susan to serve as the base for the heater rather than building one.
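[Matthias]’s software is his own, but the tracking loop is simple enough to sketch. Here’s an illustrative version using OpenCV frame differencing to find the motion’s centroid column and turn it into a stepper target; every threshold and scale factor below is made up:

```python
import cv2
import numpy as np

STEPS_ACROSS_VIEW = 200  # stepper steps spanning the camera's view (a guess)
MOTION_THRESHOLD = 25    # per-pixel brightness change that counts as motion
MIN_MOTION_PIXELS = 500  # ignore anything smaller as noise (arbitrary)

cap = cv2.VideoCapture(0)
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Difference against the previous frame to find what moved.
    moving = cv2.absdiff(gray, prev) > MOTION_THRESHOLD
    prev = gray
    if moving.sum() < MIN_MOTION_PIXELS:
        continue

    # Mean column of the moving pixels becomes the heater's heading.
    center_x = np.argwhere(moving)[:, 1].mean()
    target = int(center_x / gray.shape[1] * STEPS_ACROSS_VIEW)
    # step_to(target)  # hand the position to the stepper driver (not shown)
```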

We’re never disappointed by [Matthias]’ videos, and we’re always impressed by his range as a hacker. From DIY power tools to wooden logic circuits to his recent Lego chocolate engraver, he always finds ways to make things interesting.

Neural Network Zaps You To Take Better Photographs

It’s ridiculously easy to take a bad photograph. Your brain is a far better Photoshop than Photoshop, and the amount of editing it does on the scenes your eyes capture often results in marked and disappointing differences between what you saw and what you shot.

Taking your brain out of the photography loop is the goal of [Peter Buczkowski]’s “prosthetic photographer.” The idea is to use a neural network to constantly analyze a scene until maximal aesthetic value is achieved, at which point the user involuntarily takes the photograph.

But the human-computer interface is the interesting bit — the device uses a transcutaneous electrical nerve stimulator (TENS) wired to electrodes in the handgrip to involuntarily contract the user’s finger muscles and squeeze the trigger. (Editor’s Note: This project is about as sci-fi as it gets — the computer brain is pulling the strings of the meat puppet. Whoah.)

Meanwhile, back in reality, it’s not too strange a project. A Raspberry Pi watches the scene through a Pi Cam and uses a TensorFlow neural net trained on a set of high-quality photos to determine when to trip the shutter. The video below shows it in action, and [Peter]’s blog has some of the photos taken with it.
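The decision loop amounts to scoring frames and firing when a threshold is crossed. As a rough illustration only (the model file, input size, and cutoff below are placeholders, not [Peter]’s actual code), it might look like this:

```python
import tensorflow as tf

# Placeholder: a Keras model trained to rate photos 0..1 against a
# corpus of high-quality photographs. Name and input size are assumed.
model = tf.keras.models.load_model("aesthetic_scorer.h5")
FIRE_THRESHOLD = 0.9  # arbitrary cutoff for "good enough to shoot"

def should_fire(frame_rgb):
    """Return True when the current camera frame scores high enough
    to trigger the TENS unit. frame_rgb: HxWx3 uint8 numpy array."""
    x = tf.image.resize(tf.cast(frame_rgb, tf.float32) / 255.0, (224, 224))
    score = float(model(tf.expand_dims(x, 0))[0, 0])
    return score >= FIRE_THRESHOLD
```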

We’re not sure this is exactly the next “must-have” camera accessory, and it probably won’t help with snapshots and selfies, but it’s an interesting take on the human-device interface. And if you’re thinking about the possibilities of a neural net inside your camera to prompt you when to take a picture, you might want to check out our primer on TensorFlow to get started.
