Creating Video Trails In OpenCV

The video trail effect is nothing new: it goes back at least to music videos like The Jacksons’ 1978 “Blame It on the Boogie.” Now, [Antonio Ospite] has put together a nice article that shows the basics of recreating the effect in live video using the open source computer vision library OpenCV and a short script. The script can run in multiple modes, producing either classic video trails or “catch-up” trails (where the trail reverses into a final frame).
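
For a taste of how little code this takes, here is a minimal sketch of the classic trail effect — our own toy example, not [Antonio Ospite]’s script. Each new frame is blended into a floating-point accumulator, so older frames fade out slowly; the camera index and decay factor are assumptions you would tune yourself.

import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # any camera or video file works
ok, frame = cap.read()
trail = np.float32(frame)            # floating-point accumulator

while ok:
    # A small alpha means old frames linger longer, giving a longer trail.
    cv2.accumulateWeighted(frame, trail, 0.1)
    cv2.imshow("trail", cv2.convertScaleAbs(trail))
    if cv2.waitKey(30) & 0xFF == 27:  # press Esc to quit
        break
    ok, frame = cap.read()

cap.release()
cv2.destroyAllWindows()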

This is an interesting example of how much easier these video effects have become to create. The Jacksons’ video was made with a Scanimate and Quantel Paintbox system as big as a closet that cost hundreds of thousands of dollars; today, you can get the same look with free software and a cheap PC. All that’s left is figuring out what in our modern world looks awesome with this throwback effect.

Continue reading “Creating Video Trails In OpenCV”

Getting Biometrics in Hand

It is amazing how quickly you get used to a car that starts as long as the key is somewhere on your person. Switch vehicles, and fishing the key out to stick it in the ignition suddenly feels like a nuisance. Biometrics aims to make things even easier: why carry around a key (or an access card) at all if a computer can uniquely identify you?

[Alexis Ospitia] wanted to experiment with vein matching biometrics and had good results with a Raspberry Pi, a webcam, and a custom IR illumination system. Hemoglobin absorbs infrared light, so under IR illumination the veins in your hand show up as a dark pattern that is as unique as other biometrics (like fingerprints, ear prints, and retinal vein patterns). [Alexis’] post is in Spanish, but Google Translate does a fine job as soon as you realize that it thinks “fingerprint” is “footprint.” The software uses OpenCV, but we’ve seen the same thing done in MATLAB (see the video below).
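
To give an idea of what the vein-extraction step might look like, here is a minimal OpenCV sketch — our own guess at the approach, not [Alexis]’s code. The file name and threshold parameters are placeholders.

import cv2

# Hypothetical IR capture of a hand, loaded as grayscale.
ir = cv2.imread("hand_ir.png", cv2.IMREAD_GRAYSCALE)

# CLAHE boosts local contrast so veins stand out from surrounding tissue.
clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
enhanced = clahe.apply(ir)

# Veins absorb IR and appear dark; adaptive thresholding picks them out
# even when the illumination is uneven.
veins = cv2.adaptiveThreshold(enhanced, 255,
                              cv2.ADAPTIVE_THRESH_MEAN_C,
                              cv2.THRESH_BINARY_INV, 31, 5)
cv2.imwrite("vein_mask.png", veins)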

Continue reading “Getting Biometrics in Hand”

Hackaday Prize Semifinalist: Picking Up Litter With Robots

On beaches, in parks, and in [BDM]’s back yard, there’s litter everywhere. The usual solution to this problem is to hire someone or round up volunteers to pick up all this trash. We’re living in the future, though, and that means robots. For his Hackaday Prize entry, [BDM] is building a robot that picks up trash.

A robot that picks up litter is a very, very interesting problem. It can’t be remote controlled, because then it would be more efficient to just get out there and break your back picking up bottles yourself. That means it must work autonomously: identifying litter, picking it up, and disposing of it.

For the identification part of the problem, [BDM] is using computer vision that captures an RGB image and distinguishes litter from natural objects. The computer vision is far from perfect right now, but it does a good job, all things considered.
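
We don’t know exactly how [BDM]’s classifier works, but one simple-minded way to discriminate against natural objects is to mask out vegetation-colored pixels in HSV space and treat whatever remains as candidate litter. Everything below — the test image and the hue band — is a hypothetical sketch.

import cv2

frame = cv2.imread("yard.jpg")                  # hypothetical test image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Rough hue range for grass and foliage; would need tuning per scene.
natural = cv2.inRange(hsv, (30, 40, 40), (90, 255, 255))

# Keep only the pixels that don't look like vegetation.
candidates = cv2.bitwise_and(frame, frame, mask=cv2.bitwise_not(natural))
cv2.imwrite("litter_candidates.jpg", candidates)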

The next biggest problem is picking the trash up and disposing of it. For this, [BDM] has repurposed a Power Wheels and attached a DIY robot arm. It’s not a very powerful arm, and a children’s toy probably isn’t the best platform, but it is the start of something very, very cool.

You can check out [BDM]’s video for the project below.

Continue reading “Hackaday Prize Semifinalist: Picking Up Litter With Robots”

Hackaday Links: August 16, 2015

[Matt] created an animated gif of New Horizons’ Pluto flyby. The source images were taken from the raw LORRI images, modified so the background star field could be seen, and assembled with OpenCV. Because Pluto and Charon orbit each other around a point above Pluto’s surface, simply putting Pluto in the center of each frame wouldn’t work. It’s the best visual explanation of this weird arrangement yet, all brought to you by the magic of OpenCV and Python.
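
Recentring frames like this boils down to a translation matrix per frame. Here is a minimal sketch of the idea — not [Matt]’s actual pipeline — and the barycenter coordinates are made-up stand-ins for values you would measure per frame.

import cv2
import numpy as np

def recenter(frame, barycenter, out_size=(512, 512)):
    # Translate the frame so `barycenter` lands at the image center.
    dx = out_size[0] / 2 - barycenter[0]
    dy = out_size[1] / 2 - barycenter[1]
    M = np.float32([[1, 0, dx], [0, 1, dy]])
    return cv2.warpAffine(frame, M, out_size)

frame = cv2.imread("lorri_0001.png")            # hypothetical LORRI frame
aligned = recenter(frame, barycenter=(301.4, 250.2))
cv2.imwrite("aligned_0001.png", aligned)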

On the subject of Kickstarter creators who don’t understand the conservation of energy, I present this.

We don’t know exactly what’s going on with this one, but here’s a swimming pool covered with RGB LEDs. It’s controlled by two Rainbowduinos, and looks like the coolest disco floor you’ve ever seen.

[Frank]’s 2011 Hyundai Santa Fe wasn’t cool enough, so he added an F16 flight stick to his shift knob. The choice of joystick is paramount here: Saitek joysticks look too techy, Logitech ones are too expensive, and the Warthog H.O.T.A.S. costs $400. Joysticks are extremely niche peripherals these days, it seems. He ended up strapping an old F16 joystick from the ’90s to his shift knob, and it looks close enough to the real thing.

Two bodgers are stuffing the engine from a Toyota Celica into a 1980 Mini, and they’re trying to make it look stock. We’ve seen their project before, and now there’s a new episode. In this episode: the pedal box, the steering wheel, and figuring out how to make the car drive straight.

Hackaday Prize Entry: An Open Source Industrial Camera

Over the last few years, connecting a camera to the Internet has gotten cheaper and cheaper. The advances that made this possible came not from security cameras but from tiny cell phone camera modules, ARM boards, and embedded computing. Right now, if you want a livestream of your back yard, you’d probably grab a Raspberry Pi and camera module. That works for 90% of cases, but what if you want to livestream a slightly harsher environment? What if you want image processing right on the camera? What if you want the camera to carry an environmental protection rating?

[Apodiant]’s entry for the 2015 Hackaday Prize tackles exactly these problems. It’s an Open Source Industrial Smart Camera with Ethernet, USB, and serial outputs and an ARM CPU for image processing, all tucked away in a sturdy aluminum enclosure.

The preliminary BOM for this camera is built around an i.MX6 – a very capable applications processor that can run Linux and OpenCV. The image sensor is a 1.2 megapixel unit [Apodiant] already has experience with, and the enclosure is an off-the-shelf part for anyone who wants to build their own.

If this sort of setup sounds familiar, you’re right: a few projects have taken camera modules, added a powerful processor, and run image processing right on board. The latest in that long line is the OpenMV. It had a successful Kickstarter, and since [Apodiant] is going for the Hackaday Prize Best Product competition, this looks like a good fit.

Googly Eyes Follow You Around the Room

If you’re looking to build the next creepy Halloween decoration or simply thinking about trying out OpenCV for the first time, this project has you covered. [Glen] made a pair of giant googly eyes that follow you around the room using some servos and some very powerful software.

The project was documented in three parts. In Part 1, [Glen] models and builds the eyes themselves, including installing the servo motors that will eventually move them around. The second part involves an Arduino and power supply that will control the servos, and the third part goes over using OpenCV to track faces.

The third part is arguably the most interesting if you’re new to OpenCV: [Glen] uses the library to detect faces in the webcam stream. From there, the computer picks out the most prominent face and sends commands to the Arduino to move the eyes to the appropriate position. The writeup goes into great detail, from Arduino code to installing Ubuntu to running OpenCV for the first time!
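
If you want to try the same trick yourself, the face-tracking half boils down to something like this minimal sketch. It uses OpenCV’s stock Haar cascade rather than necessarily whatever [Glen] used, and the serial port and message format are hypothetical.

import cv2
import serial

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
arduino = serial.Serial("/dev/ttyUSB0", 9600)   # hypothetical port
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces):
        # "Most prominent" here just means the largest bounding box.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        cx, cy = x + w // 2, y + h // 2
        arduino.write(f"{cx},{cy}\n".encode())  # made-up protocol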

We’ve featured some of [Glen]’s projects before, like his FPGA-driven LED wall, and it’s good to see he’s still making great things!

Continue reading “Googly Eyes Follow You Around the Room”

Eye-Controlled Wheelchair Advances from Talented Teenage Hackers

[Myrijam Stoetzer] and her friend [Paul Foltin], 14 and 15 year old kids from Duisburg, Germany, are working on an eye-movement-controlled wheelchair. They were inspired by the EyeWriter project, which we’ve been following for a long time. The EyeWriter was built for Tony Quan, a.k.a. Tempt1, by his friends. In 2003, Tempt1 was diagnosed with the degenerative nerve disorder ALS; he is now fully paralyzed except for his eyes, but has been able to use the EyeWriter to continue his art.

This is their first big leap up from Lego Mindstorms. The eye tracker consists of a safety glasses frame, a regular webcam, and IR SMD LEDs. They removed the IR blocking filter from the webcam so it works in all lighting conditions. Image processing is handled by an Odroid U3 – a compact, low cost ARM quad core SBC capable of running Ubuntu, Android, and other Linux systems. They initially tried a Raspberry Pi, which managed just about 3 fps, compared to 13–15 fps from the Odroid. The code is written in Python, which they are learning as they go, and uses the OpenCV libraries. An Arduino controls the motors via an H-bridge driver and also handles calibrating the eye tracker: potentiometers connected to the Arduino’s analog inputs allow the tracker to be adjusted to individual users.
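
The pupil-finding step probably looks something like this minimal sketch – our own guess, not the team’s code. Under IR light the pupil is the darkest blob in the frame, so threshold it out and take the centroid of the largest contour; the threshold value is an assumption.

import cv2

def find_pupil(gray):
    # Dark pixels (the pupil) become white in the inverted mask.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    c = max(contours, key=cv2.contourArea)
    m = cv2.moments(c)
    if m["m00"] == 0:
        return None
    # Centroid of the largest dark blob = estimated pupil position.
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))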

The webcam video stream is filtered to obtain the pupil position, which is compared against four presets for forward, reverse, left, and right. The presets can be adjusted using the potentiometers. An enable switch, manually activated at present, ensures the wheelchair moves only when commanded; the plan is to later replace it with tongue activation or maybe cheek muscle twitch detection.
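
Mapping pupil position to a drive command can then be as simple as a nearest-preset lookup with a dead zone, along these lines (the coordinates here are hypothetical stand-ins for the potentiometer-calibrated values):

PRESETS = {"forward": (320, 120), "reverse": (320, 360),
           "left": (160, 240), "right": (480, 240)}
DEADZONE = 60  # pixels; ignore a pupil too far from every preset

def command_for(pupil):
    # Return the nearest preset name, or None if outside all dead zones.
    if pupil is None:
        return None
    d2 = lambda p: (p[0] - pupil[0]) ** 2 + (p[1] - pupil[1]) ** 2
    name = min(PRESETS, key=lambda k: d2(PRESETS[k]))
    return name if d2(PRESETS[name]) < DEADZONE ** 2 else None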

First tests were done on a small mockup robotic platform. After winning a local competition, they bought a second-hand wheelchair and started all over again. This time they tried the Raspberry Pi 2 Model B, which managed about 8–9 fps – not as fast as the Odroid, but at half the cost it seemed like a workable solution, since their aim is to make the build as cheap as possible. They would appreciate any help improving performance, whether by tightening up their code or by using all four cores more efficiently.

For the bigger wheelchair, they used recycled car windshield wiper motors and some relays to switch them, and 3D printed an enclosure for the camera as well as wheels to help turn the wheelchair. Further details are available on [Myrijam]’s blog, and they have documented their build (German, PDF) with their sights set on the German National Science Fair. The team is working on an English translation of the documentation and will soon release all design files and source code under a CC-BY-NC license.