Catching Drops Of Water With LEDs

Ever wonder how they capture those seemingly perfectly timed photographs of water droplets? Most of the time it's done with an optointerrupter that detects the falling droplet and then triggers a light source a few milliseconds later, with your camera ready and waiting.

This is typically done with something called an air-gap flash, which is usually rather expensive or difficult to make, but [Michal] figured out an easier way suitable for some applications: using an array of LEDs to illuminate the scene.

He's got an IR diode, a photo-resistor, a few spacers, some plastic, and a bunch of hot glue making up his optointerrupter. When the droplet passes through the IR beam it breaks the signal from the photo-resistor, which then triggers his ATmega48P. The micro waits 80 milliseconds (a delay he timed out experimentally) and then turns on the LEDs for approximately 50 microseconds. Meanwhile his camera watches the whole event with a shutter speed of a few seconds.

This works because LEDs have rise and fall times much shorter than a traditional camera flash: a normal flash stays lit for 1-2 milliseconds, as opposed to this 50 microsecond LED pulse. Just take a look at some of the pictures!
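A quick back-of-the-envelope check shows why the short pulse matters. Using the 80 ms delay from the write-up and plain free-fall physics (the numbers are illustrative, not measurements from [Michal]'s rig), the motion blur during the exposure is just speed times pulse length:

```python
# Why a 50 us LED pulse freezes a falling drop where a 1-2 ms xenon
# flash would smear it. Simple free-fall model; illustrative only.
G = 9.81  # gravitational acceleration, m/s^2

def drop_speed(t_fall):
    """Speed of a drop after free-falling for t_fall seconds (m/s)."""
    return G * t_fall

def motion_blur(t_fall, t_flash):
    """Distance the drop travels while the light source is on (metres)."""
    return drop_speed(t_fall) * t_flash

v = drop_speed(0.080)                    # speed 80 ms after breaking the beam
blur_led = motion_blur(0.080, 50e-6)     # 50 us LED pulse
blur_xenon = motion_blur(0.080, 1.5e-3)  # ~1.5 ms conventional flash

print(f"drop speed:            {v * 1000:.0f} mm/s")
print(f"blur with LED pulse:   {blur_led * 1e6:.0f} um")
print(f"blur with xenon flash: {blur_xenon * 1e3:.2f} mm")
```

About 40 µm of blur with the LED pulse versus over a millimetre with a conventional flash, which is the difference between a crisp droplet and a streak.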

Continue reading “Catching Drops Of Water With LEDs”

Hyperspectral Imaging With A DSLR

It's a relatively simple task to find evidence of helium by just looking at the sun; all you need is a prism, diffraction grating, and a web cam. DIY spectrometers have been around for ages, but most of them only produce a spectrum, not a full image complete with spectral data. Now it's possible to take an image of an object, complete with that object's spectrum, using a DSLR, some lenses, a PVC pipe, and the same diffraction grating from your DIY spectrometer.

The idea behind a hyperspectral imager is to gather the spectral data of each pixel of an image. The spectral data is then assembled into a 3D data cube, with two dimensions dedicated to the image, and the third dimension used to represent wavelength. There are a surprising number of applications for this technique, ranging from agriculture and medicine to some extremely creepy surveillance systems.
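The data-cube layout is easy to picture in code. Here's a minimal NumPy sketch (the shapes and band count are made up for illustration, not taken from the paper):

```python
import numpy as np

# A hyperspectral "data cube": two spatial axes plus one wavelength axis.
height, width, bands = 4, 5, 31              # e.g. 400-700 nm in 10 nm steps
wavelengths = np.linspace(400, 700, bands)

cube = np.random.rand(height, width, bands)  # stand-in for captured data

# The spectrum of a single pixel is a 1-D slice along the third axis:
spectrum = cube[2, 3, :]                     # all bands at pixel (row=2, col=3)

# A conventional grayscale image at one wavelength is a 2-D slice:
band_550nm = cube[:, :, np.argmin(np.abs(wavelengths - 550))]
```

Slicing along one axis gives you an ordinary monochrome image; slicing along the other two gives you a per-pixel spectrum, which is exactly what a plain DIY spectrometer can't do.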

The authors of this paper (freakin' huge PDF) used a piece of PVC pipe, three camera lenses, a diffraction grating, and a small paper aperture to construct their hyperspectral imager. Images are captured using a standard, multi-exposure HDR method, assembling the raw data from the camera into a hyperspectral image with MATLAB.

There’s a ton of awesome info in the PDF, covering how the authors calibrated their system for different lighting conditions, interpreted the RGGB Bayer sensor in the camera, and a few examples of what kind of image can be constructed with this kind of data. That’s a recommended read, right there.
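For context on the Bayer step: an RGGB sensor interleaves its four colour planes in a 2×2 tile, and pulling them apart is a matter of strided slicing. This sketch shows the generic layout only; the paper's actual calibration pipeline is more involved:

```python
import numpy as np

# Extracting the four colour planes from a raw RGGB Bayer mosaic.
raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in for raw sensor data

r  = raw[0::2, 0::2]   # red sites:    even rows, even columns
g1 = raw[0::2, 1::2]   # first green:  even rows, odd columns
g2 = raw[1::2, 0::2]   # second green: odd rows, even columns
b  = raw[1::2, 1::2]   # blue sites:   odd rows, odd columns

g = (g1 + g2) / 2      # a common simplification: average the two green planes
```

Each plane comes out at half the sensor resolution in each dimension, which is why demosaicing (or, as here, working with the raw planes directly) matters for careful spectral work.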

Thanks [Yannick] for the tip.

Samsung NX300 Gets Rooted


[Ge0rg] got himself a fancy new Samsung NX300 mirrorless camera. Many of us would just take some pretty pictures, but not [Ge0rg], he wanted to see what made his camera tick. Instead of busting out the screwdrivers, he started by testing his camera’s security features.

The NX300 is sold as a "smart camera" with NFC and WiFi connectivity. The NFC connectivity turns out to be just an NXP NTAG203 tag embedded somewhere in the camera. This is similar to the NFC tags we gave away at The Gathering in LA. The tag is designed to launch an Android app on a well-equipped smartphone. The tag can be write-locked, but Samsung didn't set the lock bit. This means you can reprogram and permanently lock the tag as a link to your favorite website.

[Ge0rg] moved on to the main event, the NX300's WiFi interface. A port scan revealed the camera is running an unprotected X server with the Enlightenment window manager. Let that sink in for a second. The open X server means that an attacker can spoof keystrokes, push images, and point applications to the camera's screen.

In a second blog post, [Ge0rg] tackled attaining root access on the camera. Based on the information he had already uncovered, [Ge0rg] knew the camera was running Linux. Visiting Samsung's open source software center to download the open source portions of the NX300 confirmed that. After quite a bit of digging and several red herrings, [Ge0rg] found what he was looking for. The camera would always attempt to run an autoexec.sh from the SD card's root directory at boot. [Ge0rg] gave the camera the script it was looking for, and populated it with commands to run BusyBox's telnet daemon. That's all it took – root shell access was his.
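The boot hook amounts to only a few lines of shell. The script below is a sketch of the idea, not a copy of [Ge0rg]'s actual file; the BusyBox path and flags shown are a typical invocation and may differ on the NX300:

```shell
#!/bin/sh
# autoexec.sh - placed in the root of the SD card; the NX300 firmware
# runs this automatically at boot.
# Start BusyBox's telnet daemon with a root shell as the login program,
# backgrounded so the camera's normal boot continues.
/usr/bin/busybox telnetd -l /bin/sh &
```

Since the firmware executes the script as root, anything it launches, the telnet daemon included, inherits root privileges. No exploit required, just an unauthenticated boot hook.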


[Image via Wikimedia Commons/Danrok]

Making Manual Lens Flares With A Few Simple Parts


If you're an aspiring film maker hoping to be the next [J.J. Abrams] with a mild (severe?) obsession with lens flares, then this Instructable is for you!

Modern camera lenses are designed to prevent lens flare, but sometimes, just sometimes, you want a cool lighting flare in your video. Of course you could add them in post-production, but that's kind of cheating, and without expensive video editing software it's not very easy to do either.

Now you could just throw a super bright LED flashlight on set and hope for the best, but you’ll never get that cool Star Wars or Star Trek blinding purple line… unless you add something on your camera to help scatter the light! [Jana Marie] has figured out just how to do that. Continue reading “Making Manual Lens Flares With A Few Simple Parts”

Underwater GoPro Hero 2 Sees Clearly Again


GoPros are great action cameras for snagging photos and videos in places where you can't normally bring real camera gear. The problem is, even with the waterproof GoPro case for the Hero 2, underwater videos tend to be blurry and out of focus. Unsatisfied with his footage, [Mitchell] decided to make his own lens for the case!

The waterproof case has a removable concave lens, but for whatever reason it doesn't perform well underwater. Lucky for [Mitchell], it's quite easily removed with six screws, revealing a nice thick gasket and the lens. Instead of trying to go fancy with some glass element from a broken camera, he's just taken some 1/4″ plexiglass and cut out a piece to fit the case. It was a bit too thick for the original configuration, so he's actually flipped the retaining ring upside down to space the lens away from the actual camera. A bunch of silicone later and the case is waterproof again with a new lens!
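There's a likely optical reason a flat plexiglass port works so well here. With a flat window, water's refractive index simply makes subjects appear closer by a constant factor, which the camera can focus on; a curved window, by contrast, becomes an unintended lens element once water replaces air on its outer surface. A quick illustrative calculation (small-angle approximation, not from [Mitchell]'s write-up):

```python
# Apparent distance of an underwater subject seen through a flat port.
# Small-angle approximation: the subject appears closer by a factor of n.
N_WATER = 1.33  # refractive index of water

def apparent_distance(true_distance_m):
    """Where an underwater subject appears to be through a flat window (m)."""
    return true_distance_m / N_WATER

print(apparent_distance(1.0))  # a subject 1 m away appears ~0.75 m away
```

That uniform "everything looks 25% closer" shift is easy for a fixed-focus camera to live with; the varying refraction through a curved port is not.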

The resulting footage with the new lens looks awesome underwater — take a peek after the break.

Continue reading “Underwater GoPro Hero 2 Sees Clearly Again”

Using The Raspberry Pi To See Like A Bee


The Raspberry Pi board camera has a twin brother known as the NoIR camera, a camera without an infrared blocking filter that allows anyone to take some shots of scenes illuminated with 'invisible' IR light, investigate the health of plants, and do some other cool stuff. The sensor in this camera isn't just sensitive to IR light – it goes the other way as well, allowing some investigations into the UV spectrum, and showing us what bees and other insects see.

The only problem with examining the UV spectrum with a small camera is that the sensor is far more sensitive to visible and IR light than it is to UV. To peer into this strange world, [Oliver] needed a UV pass filter, a filter that passes only UV light.

By placing the filter between the still life and the camera, [Oliver] was able to shine a deep UV light source and capture the image of a flower in UV. The image above and to the right isn’t what the camera picked up, though – bees cannot see red, so the green channel was shifted to the red, the blue channel to the green, and the UV image was placed where the blue channel once was.
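The channel remapping described above is a one-liner per channel in NumPy. The arrays here are tiny stand-ins for real frames, but the shifts match the article: green shown as red, blue shown as green, and the UV capture taking the blue slot:

```python
import numpy as np

# Remap a visible-light RGB frame plus a UV capture into a "bee view" image.
h, w = 2, 2
visible = np.random.rand(h, w, 3)   # RGB frame from the NoIR camera
uv = np.random.rand(h, w)           # monochrome UV-filtered capture

bee_view = np.empty_like(visible)
bee_view[..., 0] = visible[..., 1]  # green channel shown as red
bee_view[..., 1] = visible[..., 2]  # blue channel shown as green
bee_view[..., 2] = uv               # UV image shown as blue
```

The red channel of the original frame is simply discarded, which mirrors the biology: bees can't see red, so nothing in their visual range maps to it.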

Continue reading “Using The Raspberry Pi To See Like A Bee”

Eye Tracking With The Oculus Rift


There's a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.
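The core of this kind of dark-pupil tracking can be sketched in a few lines: under IR illumination the pupil is the darkest region in the frame, so thresholding it and taking the centroid of the dark pixels gives a gaze point. [Diako]'s actual pipeline is surely more robust; this NumPy-only version just shows the idea on a synthetic frame:

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Return (row, col) centroid of pixels darker than `threshold`."""
    ys, xs = np.nonzero(gray < threshold)
    if len(ys) == 0:
        return None                 # no pupil found in this frame
    return ys.mean(), xs.mean()

# Synthetic 8-bit "eye image": bright background, dark pupil blob.
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[40:50, 60:70] = 10            # 10x10 dark square standing in for the pupil

print(pupil_center(frame))          # -> (44.5, 64.5)
```

Real implementations add blob filtering, ellipse fitting, and glint rejection, but the threshold-and-centroid step is where they all start.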

To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. These two cameras are fed into the same computer, the gaze tracking is overlaid with the image from the front of the headset, and right away the user has a visual indication of where they're looking.

Yes, using a computer to know where you're looking may seem like a rather useless build, but stuff like this is used in research and extraordinarily high tech heads-up displays. Although he's not using the motion tracking on the Oculus, if [Diako] were to do so, he'd have the makings of one of the most powerful heads-up displays possible.

Continue reading “Eye Tracking With The Oculus Rift”