Technology vanishes. It either succeeds and becomes ubiquitous or fails. For example, there was a time when networking and multimedia were computer buzzwords. Now they are just how computers work. On the other hand, when was the last time you thought about using a CueCat barcode reader to scan an advertisement? Then there are the things that have their time and vanish, like pagers. It is hard to decide which category digital cameras fall into. They are being absorbed into our phones and disappearing as a separate category for most consumers. But have you ever wondered about the first digital camera? The story isn’t what you would probably guess.
The first digital camera I ever had was a Sony that took a floppy disk. Surely that was the first, right? Turns out, no. There were some very early attempts that didn’t really have the technology to make them work. The Jet Propulsion Laboratory was using analog electronic imaging as early as 1961 (they had been developing film on the moon, but certainly needed a better way). A TI engineer even patented the basic outline of an electronic camera in 1972, but it wasn’t strictly digital. None of these efforts bore practical fruit as truly digital devices. It would take Eastman Kodak to create a portable digital camera, even though they were not the first to commercialize the technology.
Continue reading “Dawn of the First Digital Camera”
Most of us, if we have bought a single board computer with the capability to support a camera, will have succumbed to temptation and shelled out for that peripheral in the hope that we can coax our new toy into having sight. We’ll have played with the command line tool and taken a few random images of our bench, but then what? There is so much possibility in a camera that our colleague [Steven Dufresne] wanted to explore with his Raspberry Pi, so he built a motorised eyeball mount with which to do so.
Pan & tilt mounts using RC servos are nothing especially new, but into this one he’s put some design effort that many others lack. A lot of work has gone into ensuring there is no interference between the two axes, and, in a slightly macabre twist until you remember it’s a model he’s talking about, the unit has been designed to fit inside a human head.
The servos are driven from the Pi using a servo driver board he’s discussed in another video. Once he’s described the assembly, with a few design tweaks thrown in, he has a quick look at the software demo he’s written emulating neurons for eye tracking. He promises it will be put up somewhere for download in due course.
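Whatever driver board sits between the Pi and the servos, the maths underneath is the same: a standard hobby servo expects a pulse repeated every 20 ms whose width sets the angle. Here is a minimal sketch of that conversion, assuming the common 500–2500 µs range over 0–180°; real servos vary, so these endpoints are placeholders you would tune for your own hardware.

```python
def servo_pulse_us(angle_deg, min_us=500.0, max_us=2500.0):
    """Map a servo angle (0-180 degrees) to a PWM pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's travel
    return min_us + (max_us - min_us) * angle_deg / 180.0

def servo_duty_cycle(angle_deg, frame_us=20000.0):
    """Duty cycle (0-1) for the usual 50 Hz (20 ms) servo frame at a given angle."""
    return servo_pulse_us(angle_deg) / frame_us
```

Centring both eyeball axes would then mean asking the driver board for `servo_duty_cycle(90)`, a 1.5 ms pulse in a 20 ms frame.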
If you’re in the market for a pan & tilt mount for your Pi, this one could make a lot of sense to throw at your 3D printer. It’s certainly more accomplished than this previous one we’ve shown you.
Continue reading “Feast Your Eyeballs On This Mechanical Eyeball”
Light painting: there’s something that never gets old about waving lights around in a long exposure photo. Whilst most light paintings are single shots, some artists painstakingly create frame-by-frame animations. This is pretty hard to do when moving a light around by hand: it’s mostly guesswork, as it’s difficult to see the results of your efforts until after the photo has been taken. But what if you could make the patterns really precise? What if you could model them in 3D?
[Josh Sheldon] has done just that, by creating a process which allows animations formed in Blender to be traced out in 3D as light paintings. An animation is created in Blender then each frame is automatically exported and traced out by an RGB LED on a 3D gantry. This project is the culmination of a lot of software, electronic and mechanical work, all coming together under tight tolerances, and [Josh]’s skill really shines.
The first step was to export the animations out of Blender. Thanks to Blender’s open-source nature, [Josh] was able to write Python add-ons that create light paths and convert them into an efficient sequence the hardware can execute. To accommodate smooth sliding camera movements during the animation, he also wrote a motion controller add-on.
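The "efficient sequence" part of that conversion boils down to thinning the per-frame samples so the gantry isn’t asked to make moves smaller than it can usefully resolve. This is a hypothetical sketch of that idea, not [Josh]’s actual add-on code: it takes per-frame `(x, y, z, colour)` samples and drops points that barely move, with `min_step` standing in for the rig’s positioning resolution.

```python
def to_moves(samples, min_step=0.5):
    """Thin per-frame (x, y, z, rgb) samples into an efficient move list,
    skipping points closer than min_step to the previously kept point."""
    moves = []
    last = None
    for x, y, z, rgb in samples:
        if last is not None:
            dx, dy, dz = x - last[0], y - last[1], z - last[2]
            if (dx * dx + dy * dy + dz * dz) ** 0.5 < min_step:
                continue  # too close to the last emitted point; drop it
        moves.append((x, y, z, rgb))
        last = (x, y, z)
    return moves
```

Running a traced Blender path through something like this keeps the LED moving smoothly without flooding the motion controller with near-duplicate targets.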
The gantry which carried the main LED was hand-made. We’d have been tempted to buy a 3D printer and hack it for this purpose, but [Josh] did a fantastic job on the mechanical build, gaining a solidly constructed gantry with a large range. The driver electronics were also slickly executed, with custom rack-mount units created to integrate with the DragonFrame controller used for the animation.
The video ends on a call to action: an impending house move means [Josh] is unable to continue the project, but he has done much of the necessary legwork. We’d love to see this project continued, and it has been documented for anyone who wishes to do so. If you want to check out more of [Josh]’s work, we’ve previously written about that time he made an automatic hole puncher for music box spools.
Thanks for the tip, [Nick].
Continue reading “Light Painting Animations Directly From Blender”
Prior to this weekend I had assumed making holograms to be beyond the average hacker’s reach, either in skill or treasure. I was proven wrong by a Club-Mate box full of electronics, and an acrylic jig perched atop an automotive inner tube. At the Hope Conference, Tommy Johnson was sharing his hacker holography in a workshop that let a few lucky attendees make their own holograms on site!
The technique used here depends on interference patterns rather than beam splitting. A diffused laser beam is projected through holographic film onto the subject of the hologram — say a bouquet of flowers like in the video below. Photons from that beam reflect from the bouquet and pass back through the film a second time. Since light is a form of electromagnetic radiation that travels as a wave, anywhere that two peaks (one from the beam, the other from the reflected light) align on the film, exposure occurs. With just a 1/2 second exposure the film is ready to be developed, and if everything went right you have created a hologram.
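The "peaks aligning" intuition can be put in numbers. For two coherent beams of equal amplitude, the intensity at a point on the film depends only on the difference in path length the two beams travelled to reach it; it is brightest when that difference is a whole number of wavelengths and dark at half-wavelength offsets. A toy calculation, assuming a 633 nm HeNe-style laser:

```python
import cmath
import math

def fringe_intensity(path_diff, wavelength=633e-9):
    """Relative intensity where two equal-amplitude coherent beams meet,
    as a function of their path-length difference in metres."""
    k = 2 * math.pi / wavelength        # wavenumber
    field = 1 + cmath.exp(1j * k * path_diff)  # reference beam + object beam
    return abs(field) ** 2              # ranges from 0 (dark) to 4 (bright)
```

Those sub-wavelength path differences are why holography rigs obsess over vibration: the inner tube under the jig exists to keep the fringes from smearing during the half-second exposure.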
Simple, right? In theory, at least. In practice Tommy’s been doing this for nearly 30 years and has picked up numerous tips along the way. Let’s take a look at the hardware he brought for the workshop.
Continue reading “HOPE XII: Make Your Own Holograms”
Readers with long memories will remember the days when mice and other similar pointing devices relied upon a hard rubber ball in contact with your desk or other surface, that transmitted any motion to a pair of toothed-wheel rotation sensors. Since the latter half of the 1990s, though, your rodent has been far more likely to rely upon an optical sensor: in effect a tiny camera connected to motion-sensing electronics. These cameras are intriguing components with applications outside pointing devices, as is shown by [FoxIS] who has used one for robot vision.
The robot in question is a skid-steer 4-wheeled toy, to which he has added an ADNS3080 mouse sensor fitted with a lens, an H-bridge motor driver board, and a Wemos D1 Mini single board computer. The D1 serves a web page showing both the image from the ADNS3080 and an interface that allows the robot to be directed over a network connection. A pair of LiPo batteries complete the picture, with voltage monitoring via one of the Wemos analogue pins.
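Sensors in this family report motion as a stream of small (dx, dy) deltas in sensor counts, which firmware can integrate into a rough position estimate. This is an illustrative sketch, not [FoxIS]’s code, and the `counts_per_mm` scale factor is a made-up placeholder: with a lens fitted it depends on the optics and height above the surface, so it would need calibrating on the real robot.

```python
def integrate_motion(reports, counts_per_mm=40.0):
    """Accumulate (dx, dy) motion reports from an optical-flow sensor
    into an (x, y) displacement estimate in millimetres."""
    x = y = 0.0
    for dx, dy in reports:
        x += dx / counts_per_mm
        y += dy / counts_per_mm
    return x, y
```

Dead reckoning like this drifts, of course, which is part of why a live camera view on the web interface is handy for steering.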
The ADNS3080 is an interesting component and we’d love to see more of it. This laser distance sensor or perhaps this car movement tracker should give you some more info. We’ve heard rumors of them being useful for drones. Anyone?
Camera sliders are a popular build, and properly executed they can make for impressive shots for both time-lapse sequences or real-time action. But they seem best suited for long shots, as dollying a camera in a straight line just moves subjects close to the camera through the frame.
This slider with both pan and tilt axes can make moving close-ups a lot easier. With his extremely detailed build log, [Dejan Nedalkovski] shows how he went about building his with only the simplest of materials and tools. The linear rail is simply a couple of pieces of copper pipe supported by an MDF frame. The camera trolley rides the rails on common skateboard bearings and is driven by a NEMA-17 stepper, as are the pan and tilt axes. [Dejan] also provided a barn-door style pivot to tilt the camera relative to the rails, allowing the camera to slide up and down an inclined plane for really interesting shots. The controller uses an Arduino and a joystick to drive the camera manually, or the rig can be programmed to move smoothly between preset points.
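Moving "smoothly between preset points" usually means easing each axis in and out rather than slamming between positions at constant speed. A minimal sketch of that idea, assuming nothing about [Dejan]’s actual Arduino firmware, using the classic smoothstep curve:

```python
def smoothstep(t):
    """Ease-in/ease-out weighting for t in [0, 1]: zero slope at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def plan_move(start, end, steps):
    """Axis positions for a smooth move between two presets."""
    return [start + (end - start) * smoothstep(i / (steps - 1))
            for i in range(steps)]
```

Feeding a stepper a position list shaped like this gives the slow start and gentle stop that make motion-control shots look deliberate instead of mechanical.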
This is a step beyond a simple slider and feels a little more like full-blown motion control. We’ve got a feeling some pretty dramatic shots would be possible with such a rig, and the fact that it’s a simple build is just icing on the cake.
Continue reading “Simple Camera Slider Adds a Dimension or Two to Your Shots”
Thermal cameras hold an enduring fascination as well as being a useful tool for the engineer. After all, who wouldn’t want to point one at random things around the bench, laughing with glee at finding things warmer or colder than expected? But they’ve always been so expensive, and a lot of the efforts that have sought to provide one for little outlay have been rather disappointing.
This has not deterred [Offer] though, who has made an extremely professional-looking thermal camera using an M5Stack ESP32-based computer module and an AMG8833 thermal sensor array module in a 3D-printed case that copies those you’d find on a commercial unit. The modular approach makes it a simple prospect for the constructor, the software can be found on GitHub, and the case files are hosted on Thingiverse. You’ll be finding warm and cold things on your bench in no time, as the video below shows.
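The AMG8833 only delivers an 8×8 grid of temperatures, so projects like this one typically interpolate the frame up before drawing it so the display doesn’t look like a chessboard. A hedged sketch of the usual approach (plain bilinear interpolation, not necessarily what [Offer]’s firmware does):

```python
def upscale_bilinear(grid, factor):
    """Bilinearly interpolate a small thermal frame (list of rows of
    temperatures) up to a smoother image factor times larger per side."""
    h, w = len(grid), len(grid[0])
    out_h, out_w = h * factor, w * factor
    out = []
    for oy in range(out_h):
        fy = oy * (h - 1) / (out_h - 1)       # source row coordinate
        y0 = min(int(fy), h - 2)
        ty = fy - y0
        row = []
        for ox in range(out_w):
            fx = ox * (w - 1) / (out_w - 1)   # source column coordinate
            x0 = min(int(fx), w - 2)
            tx = fx - x0
            top = grid[y0][x0] * (1 - tx) + grid[y0][x0 + 1] * tx
            bot = grid[y0 + 1][x0] * (1 - tx) + grid[y0 + 1][x0 + 1] * tx
            row.append(top * (1 - ty) + bot * ty)
        out.append(row)
    return out
```

Mapping the interpolated temperatures through a colour palette then gives the familiar false-colour thermal image on the M5Stack’s screen.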
Most of the thermal cameras we’ve seen have centred upon the FLIR Lepton module, but that’s a component that remains expensive. This project shows us that thermal cameras are a technology that is slowly becoming affordable, and that greater things are to come.
Continue reading “Who Said Thermal Cameras Weren’t Accessible To The Masses?”