Perfect Jump Shots With OpenCV And Processing


[ElectricSlim] likes taking “Jump Shots” – photographs where the subject is captured in midair. He’s created a novel method to catch the perfect moment with OpenCV and Processing. Anyone who has tried jump shot photography can tell you how frustrating it is. Even with an experienced photographer at the shutter, shots are as likely to miss that perfect moment as they are to catch it. This is even harder when you’re trying to take jump shots solo. Wireless shutter releases can work, but unless you have a DSLR, shutter lag can cause you to miss the mark.

[ElectricSlim] decided to put his programming skills to work on the problem. He wrote a Processing sketch using the OpenCV library. The sketch has a relatively simple logic path: “IF a face is detected within a bounding box AND the face is dropping in height THEN snap a picture.” The system isn’t perfect: a person must be looking directly at the camera for the face to be detected. However, it’s good enough to take some great shots. The software is also repeatable enough to make animations of various jump shots, as seen in [ElectricSlim’s] video.
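
We haven’t seen [ElectricSlim]’s source, but the trigger logic is simple enough to sketch out. Here’s a minimal version of that logic path, assuming the OpenCV for Processing library (gab.opencv) and a 640×480 webcam; the bounding-box margins and the two-pixel drop threshold are our own placeholder numbers:

```processing
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;
float lastFaceY = -1;  // face y-position from the previous frame
int shots = 0;

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
  cam.start();
}

void draw() {
  if (!cam.available()) return;
  cam.read();
  opencv.loadImage(cam);
  image(cam, 0, 0);

  Rectangle[] faces = opencv.detect();
  if (faces.length > 0) {
    Rectangle face = faces[0];
    // IF the face sits inside a central bounding box...
    boolean inBox = face.x > 100 && face.x + face.width < 540 &&
                    face.y > 50 && face.y + face.height < 430;
    // ...AND it is lower than last frame (y grows downward), the jumper
    // has passed the peak and is falling: snap the picture.
    if (inBox && lastFaceY >= 0 && face.y > lastFaceY + 2) {
      saveFrame("jumpshot-" + nf(shots++, 3) + ".png");
    }
    lastFaceY = face.y;
  } else {
    lastFaceY = -1;  // lost the face, so reset the motion check
  }
}
```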

We think this would be a great starting point for a trigger system. Use a webcam to determine when to shoot a picture. When the conditions pass, a trigger could be sent to a DSLR, resulting in a much higher quality frame than what most webcams can produce.

Continue reading “Perfect Jump Shots With OpenCV And Processing”

IMU Boards As Next-gen Motion Capture Suit?

This guy takes a drink and so does the virtual wooden mannequin. Well, its arm takes a drink because that’s all the researchers implemented during this summer project. But the demo really makes us think that suits full of IMU boards are the next generation of motion capture. Not because this is the first time we’ve seen it (the idea has been floating around for a couple of years) but because the sensor chips have gained incredible precision while dropping to bargain basement prices. We can pretty much thank the smartphone industry for that, right?

Check out the test subject’s wrist. That’s an elastic bandage which holds the board in place. There’s another one on his upper arm that is obscured by his shirt sleeve. The two of these are enough to provide accurate position feedback to make the virtual model move. In this case the sensor data is streamed to a computer over Bluetooth, where a Processing script maps it to the virtual model. But we’ve seen similar 9-axis sensors in projects like this BeagleBone sensor cape. It makes us think it would be easy to have an embedded system like that on the back of a suit, collecting data from sensor boards all over the test subject’s body.
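
The receiving end could look something like this. It’s a sketch of our own, not the researchers’ code, and it assumes the IMU board streams yaw, pitch, and roll as newline-terminated, comma-separated degrees over a Bluetooth serial port (the port name and packet format are placeholders):

```processing
import processing.serial.*;

Serial port;
float yaw = 0, pitch = 0, roll = 0;

void setup() {
  size(640, 480, P3D);
  // Substitute the serial port your Bluetooth module enumerates as.
  port = new Serial(this, "COM5", 115200);
  port.bufferUntil('\n');
}

void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  float[] v = float(split(trim(line), ','));
  if (v.length == 3) {
    yaw = v[0];
    pitch = v[1];
    roll = v[2];
  }
}

void draw() {
  background(30);
  lights();
  translate(width/2, height/2);
  // Apply the streamed orientation to a stand-in "forearm" segment.
  rotateZ(radians(roll));
  rotateX(radians(pitch));
  rotateY(radians(yaw));
  box(200, 40, 40);
}
```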

Oh who are we kidding? [James Cameron’s] probably already been using this for years.

Continue reading “IMU Boards As Next-gen Motion Capture Suit?”

Hackaday Links: Sunday, July 28th, 2013


[Chris Gammell] tipped us off that he’s building an online training program for learning electronics. The ten-session course will cost money to take, but you can get the goods for free if you’re one of the beta testers. We love to listen to The Amp Hour podcast, which is just one of [Chris’] many endeavors.

Did you buy a Chromecast this week? We did, but we don’t have it in hand yet (ordered through Amazon). You can still get a look inside from the iFixit teardown.

Practice your Processing skills by using it to code a game of Pong.

A bit of lighter fluid and a hacked insert will get you a flaming wallet. We guess this is a different type of anti-pickpocket device. [Thanks Stephen]

[Brain] used a $1.50 magnifying lens to help his Raspberry Pi camera module read QR codes better.

We really like [Aaron Christophel’s] LED matrix clock (translated). He started from a marquee that must be at least a decade old. He stripped it down and figured out how to drive it using a Sanguino as a controller.

Modeling Squid Cells In Code Foregoes Connecting Voltage To Animals

[Kemper Smith] built a little piece of nature in Processing. He was inspired by a biology experiment that excited squid cells using electricity. The result is an interactive display that mimics that biology.

Last August we saw a peculiar experiment that forced Cypress Hill music on the color-changing cells of a squid. The cells make colors by stretching sacs of pigment; the larger the sacs get, the more of that color is shown. Normally this is used for camouflage. The image on the left shows the cells reacting to the signal from a headphone wire while music plays.

But we can’t all get our hands on this type of wet-ware — especially if you live far inland. So [Kemper] got to work writing some Processing code. The result is seen on the right. It does a good job of replicating the motion and color palette of the original. He’s put together a web-based demonstration which you can interact with using your mouse cursor. But we also saw him demonstrate a Kinect-based version at our local hackerspace.
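
If you want to play with the idea yourself, a toy version fits in a few lines of Processing. This isn’t [Kemper]’s code; it just fakes chromatophores as pigment dots that swell as the mouse (standing in for the stimulus) gets close, using a rough squid palette:

```processing
int COUNT = 400;
float[] x = new float[COUNT], y = new float[COUNT];
color[] c = new color[COUNT];

void setup() {
  size(640, 480);
  noStroke();
  // Browns, reds, and yellows, loosely matching squid pigment.
  color[] palette = { #5C3317, #B22222, #DAA520, #8B4513 };
  for (int i = 0; i < COUNT; i++) {
    x[i] = random(width);
    y[i] = random(height);
    c[i] = palette[int(random(palette.length))];
  }
}

void draw() {
  background(235, 225, 210);  // pale squid-skin ground
  for (int i = 0; i < COUNT; i++) {
    // Each sac expands as the stimulus approaches, like a cell
    // stretching its pigment sac to show more color.
    float d = dist(mouseX, mouseY, x[i], y[i]);
    float r = map(constrain(d, 0, 200), 0, 200, 18, 3);
    fill(c[i]);
    ellipse(x[i], y[i], r, r);
  }
}
```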

Continue reading “Modeling Squid Cells In Code Foregoes Connecting Voltage To Animals”

3D Printing Records


This is a working record created with a 3D printer. [Amanda] came up with a process that converts audio files into 3D models. These models can be printed and played on a standard record player.

The real work is done by a Processing sketch that creates an STL file. [Amanda] started off by trying to create a sine wave. She used this test to optimize the printing process. Then she used Python to extract audio data from WAV files and modified the Processing sketch to handle that data. After more tweaking, she was able to get a reasonable signal-to-noise ratio and minimize distortion.

The resulting records have a sample rate of 11 kHz and 5-6 bit resolution. The sound quality isn’t going to be the same as commercially pressed vinyl, but you can still make out the song.
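
To get a feel for what those numbers mean, here’s a back-of-the-envelope sketch, not [Amanda]’s code, that crushes a 440 Hz test tone down to 11 kHz, 6-bit samples and plots the staircase a groove cut from them would follow:

```processing
int SAMPLE_RATE = 11025;  // samples per second, roughly the record's 11 kHz
int BITS = 6;             // amplitude resolution
int LEVELS = 1 << BITS;   // 64 discrete groove depths

void setup() {
  size(640, 240);
  background(255);
  stroke(0);

  float freq = 440;          // A4 test tone, echoing the sine-wave test print
  int n = SAMPLE_RATE / 10;  // a tenth of a second of audio
  int[] samples = new int[n];
  for (int i = 0; i < n; i++) {
    float s = sin(TWO_PI * freq * i / SAMPLE_RATE);  // -1..1
    samples[i] = int(map(s, -1, 1, 0, LEVELS - 1));  // crushed to 0..63
  }

  // Plot the staircase the quantized groove would trace.
  for (int i = 1; i < width; i++) {
    line(i - 1, map(samples[(i - 1) % n], 0, LEVELS - 1, height - 10, 10),
         i,     map(samples[i % n],       0, LEVELS - 1, height - 10, 10));
  }
}
```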

An Objet Connex 500 was used to print the records. This UV printer has a 600 dpi resolution, which means it’s more accurate than extrusion printers. Your mileage may vary with different printers, but all of the Processing and Python code is available with the project write-up.

After the break, watch [Amanda] spin some 3D printed records.

Continue reading “3D Printing Records”

Dithering In Processing


To be honest, we’d heard of dithering but that was the extent of our knowledge on the topic. After looking through [Windell’s] post about using dithering in Processing, we can now say we’ve got at least a working base of knowledge.

Dithering produces an image out of just two colors, arranged so that our eyes blend them into something meaningful. The history of these algorithms goes back to early monochrome displays, but the hobby electronics we work with for fun now have comparable computing power, so perhaps it’s time to rediscover these techniques. [Windell’s] project implements the Atkinson dithering algorithm in real time on your webcam feed. He’s doing this in Processing, which should make it pretty easy to port for your own purposes.
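
Atkinson dithering is simple enough to try yourself: each pixel is snapped to black or white, and one eighth of the resulting error is pushed to six nearby pixels (the remaining two eighths are simply dropped, which gives the algorithm its characteristic high-contrast look). Here’s a minimal standalone version that works on a still image named input.jpg instead of [Windell]’s live webcam feed:

```processing
PImage img;

void setup() {
  size(640, 480);
  img = loadImage("input.jpg");  // any photo in the sketch's data folder
  img.resize(width, height);
  img.filter(GRAY);
  img.loadPixels();

  // Work on brightness values so errors can accumulate past 0..255.
  float[] lum = new float[img.pixels.length];
  for (int i = 0; i < lum.length; i++) lum[i] = brightness(img.pixels[i]);

  // Atkinson's six neighbors, each receiving 1/8 of the error.
  int[][] spread = { {1, 0}, {2, 0}, {-1, 1}, {0, 1}, {1, 1}, {0, 2} };
  for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
      int i = y * width + x;
      float old = lum[i];
      float bw = old < 128 ? 0 : 255;  // snap to black or white
      lum[i] = bw;
      float err = (old - bw) / 8.0;    // only 6/8 is diffused; 2/8 is dropped
      for (int[] s : spread) {
        int nx = x + s[0], ny = y + s[1];
        if (nx >= 0 && nx < width && ny < height) lum[ny * width + nx] += err;
      }
    }
  }

  for (int i = 0; i < lum.length; i++) {
    img.pixels[i] = color(constrain(lum[i], 0, 255));
  }
  img.updatePixels();
}

void draw() {
  image(img, 0, 0);
}
```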

So why might you want to use dithering in your own projects? Because if it can be used to make very cool milled artwork, there must be other undiscovered uses lurking around your workshop.