[ElectricSlim] likes taking “Jump Shots” – photographs where the subject is captured in midair. He’s created a novel method to catch the perfect moment with OpenCV and Processing. Anyone who has tried jump shot photography can tell you how frustrating it is. Even with an experienced photographer at the shutter, shots are as likely to miss that perfect moment as they are to catch it. This is even harder when you’re trying to take jump shots solo. Wireless shutter releases can work, but unless you have a DSLR, shutter lag can cause you to miss the mark.
[ElectricSlim] decided to put his programming skills to work on the problem. He wrote a Processing sketch using the OpenCV library. The sketch follows a relatively simple logic path: IF a face is detected within a bounding box AND the face is dropping in height, THEN snap a picture. The system isn’t perfect: a person must be looking directly at the camera for the face to be detected. However, it’s good enough to take some great shots. The software is also repeatable enough to make animations of various jump shots, as seen in [ElectricSlim]’s video.
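The trigger condition can be boiled down to a few lines. The sketch below is our own illustration in Python, not [ElectricSlim]’s Processing code; the function name, box coordinates, and the `min_drop` threshold are all assumptions. In image coordinates y grows downward, so a face "dropping in height" means its y value is increasing.

```python
# Illustrative trigger logic (not [ElectricSlim]'s actual sketch): fire the
# shutter when a detected face sits inside a vertical target box AND is
# falling, i.e. just past the peak of the jump.

def should_trigger(prev_y, curr_y, box_top, box_bottom, min_drop=4):
    """Decide whether to snap a picture.

    Image y grows downward, so 'dropping in height' means curr_y > prev_y.
    min_drop (pixels per frame) filters out detection jitter.
    """
    inside = box_top <= curr_y <= box_bottom
    falling = (curr_y - prev_y) >= min_drop
    return inside and falling

# In a real build, prev_y and curr_y would come from a face detector such as
# cv2.CascadeClassifier("haarcascade_frontalface_default.xml").detectMultiScale()
# run on consecutive webcam frames; a True result would trigger the camera.
```

The same two-condition test would work equally well feeding a hardware trigger line instead of a webcam snapshot.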
We think this would be a great starting point for a trigger system. Use a webcam to determine when to shoot a picture. When the conditions pass, a trigger could be sent to a DSLR, resulting in a much higher quality frame than what most webcams can produce.
Continue reading “Perfect Jump Shots with OpenCV and Processing”
The graphic above wasn’t painstakingly stitched together by rotating a camera lens on a lazy Susan a tiny bit, taking a picture, and repeating the process fifty times. This is high tech stuff, courtesy of Zcapture, a tool for automated 360 degree photography of small objects.
For the last 15 years, [Jared] has been spending a lot of time on eCommerce and found existing solutions for displaying products online to be very lacking. After playing around with the BASIC Stamp eight years ago and most recently the Arduino, [Jared] decided he would build something to solve his problem – an automated box that takes pictures of a rotating product.
Inside the Zcapture is an Arduino connected to a motor and the software to control Canon and Nikon DSLRs. Put the Zcapture in a soft box, light it up, set up your camera, and you have a computer-controlled lazy Susan robot that will take pictures of any object, then stitch them together into an animated GIF or a fancy eCommerce rotating image viewer.
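The firmware’s core job is just bookkeeping: advance the turntable a fixed fraction of a revolution, fire the shutter, repeat until it comes full circle. Here’s a back-of-the-envelope sketch of that schedule in Python – the function name and the 200 steps/rev figure (a common 1.8° stepper) are our assumptions, not details of [Jared]’s design.

```python
# Sketch of a Zcapture-style capture schedule (illustrative, not [Jared]'s
# firmware): split one full turntable revolution into N equal-as-possible
# moves, shooting a frame after each move.

STEPS_PER_REV = 200  # typical 1.8-degree stepper; an assumption

def capture_schedule(frames, steps_per_rev=STEPS_PER_REV):
    """Return the number of motor steps to advance before each shot.

    Distributes any remainder across the first moves so the table ends
    exactly where it started after the last frame.
    """
    base, extra = divmod(steps_per_rev, frames)
    return [base + (1 if i < extra else 0) for i in range(frames)]

# e.g. 50 frames on a 200-step motor -> 4 steps between shots
schedule = capture_schedule(50)
# On the Arduino side, each entry would become: step N times, pulse the
# DSLR's remote-shutter line, wait out the exposure, continue.
```

Because the remainder is spread across the moves, any frame count divides a revolution cleanly, not just ones that divide the step count evenly.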
Want to try out aerial photography, but can’t afford a quadcopter? [Jeremy] rigged up a low cost GoPro Slingshot and took some pretty nice flyover shots of the lake.
The slingshot itself is meant for water balloons, but easily has enough power to fire the camera. In order to get good video, some stabilization was needed. [Jeremy] made a stabilizing fin out of packaging foam, and used an eye bolt to connect it to the GoPro’s threaded tripod mount. The simple tail fin, made out of foam and zip ties, actually did a good job of stabilizing the camera.
This looks like a fun experiment to try when you’re at the lake, since you can probably build it with stuff lying around the house. For [Jeremy], it also proved to be a way to keep his dog entertained since she retrieved the camera after each shot. After the break, check out the video footage from the GoPro slinging rig.
Continue reading “GoPro Slingshot”
[Doog] builds plastic models, and like anyone who makes really small stuff, he needed a good photo booth to show off his wares and techniques. He was working with the very common ‘poster board and work light’ setup we’ve all put together, but after photoshopping seam lines one too many times, he decided to upgrade his booth to something a little better.
The new setup consists of an aluminum frame with a 40×80 inch sheet of translucent plexiglass forming the bottom and backdrop of the booth. Two lights in diffuser bags illuminate the subject from the top, while the old worklights are attached to the bottom of the table frame to light the subject from beneath.
Compared to the ‘poster board and work light’ technique of the past, [Doog]’s new photo booth is absolutely incredible for taking pictures of very small things. This model of a Spitfire looks like it’s floating and this snap of a Thunderbolt is good enough to grace magazine covers.
Of course this photobooth isn’t just limited to models, so if you’re looking at taking some pictures of hand-soldered BGA circuits in the future, you may want to think about upgrading your studio setup.
Since the 70s, NASA, NOAA, and the USGS have been operating a series of satellites designed to look at vegetation health around the world. These satellites, going under the name Landsat, use specialized camera filters that look at light reflecting off chlorophyll to gauge the health of forests, plains, oceans, and even farms. It’s all very interesting technology, and a few very cool people want to put one of these near infrared cameras in the hands of everyone.
The basic idea behind gauging the health of plants from orbit, or the Normalized Difference Vegetation Index, is actually pretty simple: healthy plants absorb red and blue light (reflecting green, hence our verdant forests) and reflect nearly all infrared light. By removing the IR filter from a digital camera and adding a ‘superblue’ filter, the NDVI can be calculated with just a little bit of image processing.
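That "little bit of image processing" is a one-line formula. The sketch below uses the standard NDVI definition; the sample reflectance values are made up for illustration.

```python
# NDVI per pixel: (NIR - Red) / (NIR + Red). Healthy vegetation reflects
# strongly in near-infrared and absorbs red, pushing the index toward +1;
# bare soil and dying plants sit much closer to zero.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel or band pair."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

healthy = ndvi(nir=0.50, red=0.08)   # strong NIR reflectance -> about 0.72
stressed = ndvi(nir=0.20, red=0.15)  # weak contrast -> about 0.14
```

With a filter-modified camera, the same arithmetic is simply applied across whole image arrays, one channel standing in for NIR and one for red.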
The folks behind this have put up a Kickstarter with rewards including a modified webcam, a custom point and shoot camera, and a very low-cost source of one of these superblue filters. Just the thing to see how your garden grows or how efficiently you can kill a houseplant.
With high-speed cameras you’re able to see bullets passing through objects, explosions in process, and other high-speed phenomena. Rarely, though, are you able to see what happens when light shines on an object without hundreds of thousands of dollars worth of equipment. A group of researchers at The University of British Columbia are doing just that with hardware that is well within the range of any home tinkerer.
Making videos of light passing through and around objects has been done before (great animated gifs of that here), but the equipment required of previous similar projects cost $300,000 and couldn’t be used outside the controlled environment of a lab. [Matthias] and his team put together a similar system for about $1,000. The only hardware required is an off-the-shelf 3D time of flight camera and a custom driver powering six red laser diodes.
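To give a feel for what a time-of-flight sensor actually measures, here is the textbook phase-to-distance relation for a continuous-wave ToF camera. This is our own illustration of the sensing principle, not the UBC team’s code; the 20 MHz modulation frequency in the example is an assumption.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_to_distance(phase_rad, mod_freq_hz):
    """Distance for a phase-measuring ToF sensor: d = c * phi / (4 * pi * f).

    The light travels out and back, which is where the factor of two folded
    into the 4*pi comes from; range is ambiguous beyond c / (2 * f).
    """
    return C * phase_rad / (4 * math.pi * mod_freq_hz)

# A sensor modulated at 20 MHz that measures a pi/2 phase shift is seeing
# a target roughly 1.87 m away.
d = phase_to_distance(math.pi / 2, 20e6)
```

Resolving the flight of a single light pulse, as in this project, layers custom laser drive electronics and clever reconstruction on top of this basic measurement.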
Aside from having a much less expensive setup than the previous experiments in recording the flight of a pulse of light, [Matthias] and his team are also able to take their setup into the field and record the flight of light in non-laboratory settings. They’ll be doing just that at this year’s SIGGRAPH conference, producing videos of light reflecting off attendee-provided objects in just a few minutes. You can check out the video for the project below.
If you have ever played around with macro photography, you’ll know how hard it is to get a focused image of something that isn’t two-dimensional. For virtually every 3D object, you’ll have to deal with the depth of field – the small region where things are actually in focus. [David] came up with a neat homebrew solution for making sure everything in his macro photos is in focus using a discarded flatbed scanner and a Raspberry Pi.
[David]’s technique relies on focus stacking. Basically, [David] takes dozens of images of the same object, moving the camera closer by a fraction of an inch before snapping each frame. These pictures are stitched together with CombineZ, a piece of software used for extending the depth of field in images.
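The core idea behind focus stacking fits in a few lines: for every pixel, keep the value from whichever frame is sharpest at that spot. The toy implementation below is purely illustrative – [David] uses CombineZ, which is far more sophisticated – and the sharpness measure here is a crude local-contrast estimate we chose for brevity.

```python
# Miniature focus stack (illustrative only): merge grayscale frames
# pixel-by-pixel, picking each value from the frame with the best local
# focus, judged by contrast against the 4-neighbour mean.

def sharpness(img, x, y):
    """Crude focus measure: absolute difference from the 4-neighbour mean."""
    h, w = len(img), len(img[0])
    nbrs = [img[j][i] for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
            if 0 <= j < h and 0 <= i < w]
    return abs(img[y][x] - sum(nbrs) / len(nbrs))

def focus_stack(frames):
    """Merge same-sized grayscale frames by best local focus per pixel."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[max(frames, key=lambda f: sharpness(f, x, y))[y][x]
             for x in range(w)] for y in range(h)]
```

Real stacking tools add alignment and scale compensation between frames, since moving the camera closer also changes magnification slightly – exactly the fiddly part CombineZ handles for [David].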
The hardware part of the build is a Raspberry Pi hooked up to a stepper motor driver and the shutter button of [David]’s camera. By attaching his camera to the carriage of a flatbed scanner, [David] can inch his camera ever closer to his object of study while grabbing the images for CombineZ.
The results are impressive, and would be nearly impossible to replicate any other way without tens of thousands of dollars in camera equipment.