Stereo photography has been around for almost as long as photography itself, and it remains a popular way to capture a scene in its 3D glory. Yet even though nearly everyone now carries one or more cameras every day in the form of a smartphone, carrying a stereo-capable system with you remains tricky. As [Pascal Martiné] explains in a How-To article, you can take two smartphones with you, but syncing up both cameras to get a stereo image isn’t so straightforward, even though this is essential if you want to prevent jarring shifts between the left and right image.
Fortunately, having two of the exact same smartphone with the exact same camera modules is not an absolute requirement, as apps like i3DStereoid offer auto-adjustments. But triggering the camera on both phones at the same instant is essential. The usual assortment of wireless remote triggers don’t work well here, and the twin-pairing in i3DStereoid had too much delay for dynamic scenes. That left the wired remote trigger option, but with a dearth of existing stereo triggers [Pascal] was forced to make his own for two iPhones out of Apple Lightning cables and wired earbud volume controls.
Although the initial prototype more or less worked, [Pascal] found that each iPhone would often ‘decide’ to release the trigger at a slightly different time, requiring multiple attempts at the perfect shot. This led him down a rabbit hole of investigating different camera apps and configurations to make shutter delay as deterministic as possible. Much of this turned out to be due to auto exposure and auto focus, with enabling AE/AF lock drastically increasing the success rate, though this has to be done manually before each shot as an extra step.
With this one tweak, he found that most stereo photo pairs are now perfectly synced, with an occasional jitter of around 3 ms, the cause of which he hasn’t tracked down yet, but which could be due to the camera app or iOS being busy with something else.
In the end, this iPhone-based stereo photography setup might not be as reliable or capable as some of the purpose-built rigs we’ve covered over the years, but it does get extra points for portability.
[TeachingTech] has a video covering the OpenScan Mini that does a great job of showing the workflow, hardware, and processing method for turning small objects into high-quality 3D models. If you’re at all interested but unsure where or how to start, the video makes an excellent guide.
We’ve covered the OpenScan project in the past, and the project has progressed quite a bit since then. [TeachingTech] demonstrates scanning a number of small and intricate objects, including a key, to create 3D models with excellent dimensional accuracy.
[Thomas Megel]’s OpenScan project is a DIY project that, at its heart, is an automated camera rig that takes a series of highly-controlled photographs. Those photographs are then used in a process called photogrammetry to generate a 3D model from the source images. Since the quality of the source images is absolutely critical to getting good results, the OpenScan hardware platform plays a pivotal role.
Once one has good quality images, the photogrammetry process itself can be done in any number of ways. One can feed images from OpenScan into a program like Meshroom, or one may choose to use the optional cloud service that OpenScan offers (originally created as an internal tool, it is now made available as a convenient processing option).
It’s really nice to have a video showing how the whole workflow works, and highlighting the quality of the results as well as contrasting them with other 3D scanning methods. We’ve previously talked about 3D scanning and what it does (and doesn’t) do well, and the results from the OpenScan Mini are fantastic. It might be limited to small objects, but it does a wonderful job on them. See it all for yourself in the video below.
The aurora borealis (and its southern equivalent, the aurora australis) is a fleeting and somewhat rare phenomenon that produces vivid curtains of color in the sky at extreme latitudes. It’s a common tourist activity to travel to areas where the aurora is more prevalent in order to catch a glimpse of it. The best opportunities are in the winter though, and since most people don’t want to spend hours outside on a cold night at high latitudes, an all-sky camera like this one from [Frank] can help notify its users when an aurora is happening.
Because of the extreme temperatures involved, this is a little more involved than simply pointing a camera at the sky and hoping for the best. The enclosure and all electronics need to withstand -50°C and operate down to at least -30°C. For the enclosure, [Frank] is going with PVC tubing with a clear dome glued into a cap that fits onto the end of the pipe, providing a water-resistant enclosure. A Raspberry Pi with a wide-angle lens camera sits on a 3D printed carriage so it can easily slide inside. The electronics use power-over-Ethernet (PoE) rather than a battery due to the temperature extremes, which conveniently also provides networking for viewing the images.
This is only part one of this build — in part two [Frank] is planning to build a system which can use this camera assembly to detect the aurora automatically and send out notifications when it sees it. Watching the night sky from the comfort of a warm house or sauna isn’t the only reason for putting an all-sky camera to use, either. They can also be used to observe meteors as they fall and then triangulate the position of the meteorites on the ground.
If your pointing device is a mouse, turn it over. Chances are you’ll see a red LED, unless you’re seriously old-school and your mouse has a ball. This light serves as the illumination for a very simple camera sensor. The mouse electronics do their thing by looking for movement in the resulting image, but it should be possible to pull out the data and repurpose the sensor as a digital camera. [Doctor Volt] has a new video showing just that with the innards of a Logitech peripheral.
The mouse contains a microcontroller and the camera part, which fortunately has an SPI interface. The correct register to query for the sensor data was deduced, and as if by magic, an image appeared. An M12 lens on a handy 3D printed mount provided focus, and the mouse case went back on as a housing for the board. The pictures have something of the Game Boy Camera about them, being low-res and monochrome, but it’s still a neat hack.
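Turning such a raw frame dump into a viewable image takes only a few lines. The sketch below is purely illustrative: the 18×18 grid of 6-bit pixels matches older optical mouse sensors such as the ADNS-2610, and the register map and frame format of the actual Logitech part may well differ.

```python
# Illustrative sketch: unpack a raw frame dump from an optical mouse sensor
# into 8-bit grayscale rows. Geometry (18x18, 6-bit pixels) is an assumption
# based on older sensors like the ADNS-2610, not the sensor in this hack.

WIDTH, HEIGHT = 18, 18
PIXEL_MASK = 0x3F  # lower 6 bits of each byte carry the pixel value

def frame_to_image(raw: bytes) -> list[list[int]]:
    """Scale each 6-bit pixel to 0-255 and split the stream into rows."""
    if len(raw) != WIDTH * HEIGHT:
        raise ValueError(f"expected {WIDTH * HEIGHT} bytes, got {len(raw)}")
    pixels = [((b & PIXEL_MASK) * 255) // PIXEL_MASK for b in raw]
    return [pixels[r * WIDTH:(r + 1) * WIDTH] for r in range(HEIGHT)]
```

From there the rows can be fed straight into any image library to save a (tiny) grayscale picture.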
As the art of film photography has gained once more in popularity, some of the accessories from a previous age have been reinvented, as is the case with [tdsepsilon]’s radar rangefinder. Photographers who specialized in up-close-and-personal street photography in the mid-20th century faced the problem of how to focus their cameras. The first single-lens reflex cameras (SLRs) were rare and expensive beasts, so for most this meant a mechanical rangefinder either clipped to the accessory shoe, or if you were lucky, built into the camera.
The modern equivalent uses an inexpensive 24 GHz radar module coupled to an ESP32 board with an OLED display, and fits in a rather neat 3D printed enclosure that again sits in the accessory shoe. It has a 3 meter range, perfect for the street photographer, and the measured distance can easily be read off and dialed in on the lens barrel.
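The article doesn’t say how the module reports distance, and hobbyist 24 GHz boards typically hand a range reading straight to the microcontroller over serial. Still, the underlying FMCW radar math is worth a quick illustration: range is proportional to the beat frequency between the transmitted and received chirps. The sweep and chirp figures below are example values, not specs from this build.

```python
# Back-of-the-envelope FMCW range calculation: R = c * f_beat * T_chirp / (2 * B).
# Purely illustrative; the module in the build likely reports distance directly.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz: float, sweep_bw_hz: float, chirp_s: float) -> float:
    """Target range in meters from the measured beat frequency."""
    return C * beat_hz * chirp_s / (2.0 * sweep_bw_hz)
```

With an assumed 250 MHz sweep over a 1 ms chirp, a 5 kHz beat works out to roughly 3 m, right at the edge of this rangefinder’s useful span.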
Whenever the revival of film photography is discussed, it’s inevitable that someone will ask why, pointing to the futility of using silver halides in a digital age. It’s projects like this one that answer that question. With second-hand SLRs being cheap and plentiful, you might ask why use a manual rangefinder instead, but the answer lies in the fun of using one to get the perfect shot. Try it, you’ll enjoy it!
If you’ve ever been concerned about privacy in a rental space or hotel room, you might have considered trying one of the many “spy camera detectors” sold online. In the video after the break, [Big Clive] tears one down and gives us an in-depth look at how these gadgets actually work, and their limitations.
Most detectors follow the same basic design: a ring of LEDs through which the user inspects a room, looking for reflections indicating a potential hidden camera. Although such a device can help spot a camera, it’s not entirely foolproof. They work best when you’re close to the center of a camera’s field of view, and other objects, like large LEDs, can produce similar reflections.
The model examined in this video takes things one step further by adding a disc of dichroic glass. Coated with a metallization layer tuned close to the wavelength of the LEDs, it effectively acts as a bandpass filter, reducing reflections from other light sources. [Big Clive] also does his customary reverse-engineering of the circuit, which is just a simple flasher powered by USB-C.
What do you get when you join a slide projector and a digital camera? Filmolimo, an open source slide scanner. The scanner uses an M5Stack Fire, an ESP32 development board. Thanks to the ESP32, you can control the device via WiFi.
All the project files, including KiCAD design files, are on GitHub. Of course, you will probably have to adapt things to your specific camera and slide projector. The PCB is double-sided and looks easy to put together. The board is mostly opto-isolation and interface between the controller and the equipment. The software allows you to change things like the time between slides, for example.
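The core sequencing job such firmware performs is simple enough to sketch: fire the camera, pulse the projector’s slide-advance input, wait for the new slide to settle, and repeat. The function names and the one-second settle time below are illustrative assumptions, not taken from the Filmolimo sources.

```python
# Illustrative slide-scanning loop. The advance/shutter callables stand in for
# the opto-isolated outputs driving the projector and camera; names and the
# default settle delay are assumptions, not the Filmolimo implementation.
import time

def scan_slides(advance, shutter, count, settle_s=1.0, sleep=time.sleep):
    """Capture `count` slides, advancing and settling between shots."""
    shots = 0
    for _ in range(count):
        shutter()        # trigger the camera for the current slide
        shots += 1
        advance()        # pulse the projector's remote-advance input
        sleep(settle_s)  # let the next slide drop and stop vibrating
    return shots
```

Passing the outputs in as callables keeps the timing logic testable on a desk, away from the projector hardware.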
This is one of those projects you probably only need for a bit. Unless, of course, you regularly scan slides. You can farm it out to a service provider, but what fun is that? If you have a few hundred thousand slides, you might need to go for speed. If you just have a few, you can get by with a simple adapter.