Texel: Art Tracks You, Tracks Time

French robot-artist [Lyes Hammadouche] tipped us off to one of his latest works: a collaboration with [Ianis Lallemand] called Texel. A “texel” is apparently a time-pixel, and the piece consists of eight servo-controlled hourglasses that can tip themselves over in response to viewers walking in front of them. Besides making graceful wavelike patterns when people walk by, they also roughly record the amount of time that people have spent looking at the piece. The hourglasses sit straight up when nobody’s around, resulting in a discrete spatial representation of people’s attention to the piece: texels.

We get jealous when we see artists playing around with toys like these. Texel uses LIDAR scanners (Kalman-filtered, naturally) to track the viewers, along with openFrameworks, OpenCV, and ROS. In short, it’s everything you’d need to build a complex, human-interactive piece like this using completely open-source tools from beginning to end. Respect!
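
For the curious, smoothing a noisy tracked position with a Kalman filter only takes a few lines of OpenCV. Here’s a minimal sketch — our own illustration, not the artists’ code — assuming a constant-velocity model and 2D position measurements coming off the LIDAR:

```python
import cv2
import numpy as np

# Minimal constant-velocity Kalman filter for smoothing a tracked 2D position,
# e.g. a viewer's location extracted from LIDAR scans (illustrative only).
# State: [x, y, vx, vy]; measurement: [x, y].
kalman = cv2.KalmanFilter(4, 2)
kalman.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
kalman.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
kalman.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kalman.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def track(raw_positions):
    """Yield smoothed (x, y) estimates from a stream of noisy measurements."""
    for x, y in raw_positions:
        kalman.predict()
        estimate = kalman.correct(np.array([[x], [y]], np.float32))
        yield float(estimate[0, 0]), float(estimate[1, 0])
```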

Continue reading “Texel: Art Tracks You, Tracks Time”

Creating Video Trails In OpenCV

The video trail effect is nothing new: it was used in music videos at least as far back as “Blame It on the Boogie” from the Jackson 5 in 1978. Now, [Antonio Ospite] has put together a nice article that shows the basics of creating this effect in live video with OpenCV, the open-source video processing library, using nothing more than a short script. The script can run in multiple ways, producing either ordinary video trails or “catch-up” trails (where the trail reverses into a final frame).
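
If you want to play along at home, here’s a minimal sketch of one way to get a trail effect in OpenCV: blend each new frame into a slowly decaying accumulator. This is our own illustration, not [Antonio Ospite]’s actual script, which is worth reading for the details.

```python
import cv2
import numpy as np

# Rough sketch of a video trail: each new frame is blended into a running
# accumulator, so moving objects leave fading copies of themselves behind.
cap = cv2.VideoCapture(0)          # webcam; use a filename for a video file
ret, frame = cap.read()
trail = frame.astype(np.float32)   # accumulator kept in float to avoid clipping

while True:
    ret, frame = cap.read()
    if not ret:
        break
    # Keep 85% of the old trail and mix in 15% of the new frame.
    cv2.accumulateWeighted(frame, trail, 0.15)
    cv2.imshow("trail", cv2.convertScaleAbs(trail))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```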

This is an interesting example of how much easier these video effects have become to create. The Jackson 5 video was made with a Scanimate and Quantel Paintbox system that was as big as a closet and cost hundreds of thousands of dollars; today, you can get the same look with free software and a cheap PC. Now you just need to figure out what in our modern world looks awesome with this throwback effect.

Continue reading “Creating Video Trails In OpenCV”

Getting Biometrics in Hand

It is amazing how quickly you get used to a car that starts as long as you have the key somewhere on your person. When you switch vehicles, it becomes a nuisance to fish the key out and insert it into the ignition. Biometrics aims to make it even easier. Why carry around a key (or an access card), if a computer can uniquely identify you?

[Alexis Ospitia] wanted to experiment with vein-matching biometrics and had good results with a Raspberry Pi, a webcam, and a custom IR illumination system. Apparently, the hemoglobin in your blood absorbs IR light, which makes the veins stand out, and the pattern of veins in your hand is as unique as other biometrics (like fingerprints, ear prints, and retina vein patterns). [Alexis’] post is in Spanish, but Google Translate does a fine job as soon as you realize that it thinks “fingerprint” is “footprint.” The software uses OpenCV, but we’ve seen the same thing done in MATLAB (see the video below).
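
To give a rough idea of what the image processing might look like, here’s a hedged sketch — not [Alexis Ospitia]’s actual code, and with placeholder file names — of pulling the dark vein lines out of an IR hand image with OpenCV: local contrast enhancement, an inverted adaptive threshold, and a little morphological cleanup.

```python
import cv2

# Rough sketch of IR vein-image preprocessing (illustrative, not the project's code):
# boost local contrast, then threshold so the dark vein lines stand out.
img = cv2.imread("hand_ir.png", cv2.IMREAD_GRAYSCALE)   # hypothetical capture

clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)
blurred = cv2.medianBlur(enhanced, 7)

# Veins are darker than the surrounding tissue, so use an inverted threshold.
veins = cv2.adaptiveThreshold(blurred, 255,
                              cv2.ADAPTIVE_THRESH_MEAN_C,
                              cv2.THRESH_BINARY_INV, 21, 5)

# Clean up speckle noise before any matching step.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
veins = cv2.morphologyEx(veins, cv2.MORPH_OPEN, kernel)
cv2.imwrite("veins.png", veins)
```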

Continue reading “Getting Biometrics in Hand”

Hackaday Prize Semifinalist: Picking Up Litter With Robots

On beaches, in parks, and in [BDM]’s back yard, there’s litter everywhere. The normal solution to this problem is to hire someone or find some volunteers to pick up all this trash. We’re living in the future, though, and that means robots. For his Hackaday Prize entry, [BDM] is building a robot that picks up trash.

A robot that picks up litter is a very, very interesting problem. It can’t be controlled by a person, or else it would be more efficient to just get out there and kill your back picking up bottles. This means it must work autonomously, and that means identifying litter, picking it up, and disposing of it.

For the identification part of the problem, [BDM] is using computer vision that captures an RGB image and discriminates litter from natural objects. Right now the computer vision is far from perfect, but it does a very good job, all things considered.
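
To give a flavor of how crude color-based discrimination can work, here’s a toy sketch — our own illustration, not [BDM]’s classifier, and assuming OpenCV 4 — that masks out green vegetation in HSV and treats any remaining large blobs as candidate litter.

```python
import cv2

# Toy illustration of separating litter from vegetation in an RGB frame:
# mask out green in HSV, then report the remaining large blobs as candidates.
frame = cv2.imread("yard.jpg")                     # hypothetical input image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Anything strongly green is assumed to be grass or leaves.
vegetation = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))
candidates = cv2.bitwise_and(frame, frame, mask=cv2.bitwise_not(vegetation))

gray = cv2.cvtColor(candidates, cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    if cv2.contourArea(c) > 500:                   # ignore tiny specks
        x, y, w, h = cv2.boundingRect(c)
        print("possible litter at", x, y, w, h)
```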

The next biggest problem is picking the trash up and disposing of it. For this, [BDM] has repurposed a Power Wheels and attached a DIY robot arm. It’s not a very powerful arm, and a children’s toy probably isn’t the best platform, but it is the start of something very, very cool.

You can check out [BDM]’s video for the project below.


Continue reading “Hackaday Prize Semifinalist: Picking Up Litter With Robots”

Hackaday Links: August 16, 2015

[Matt] created an animated GIF of New Horizons’ Pluto flyby. The source images were taken from the raw LORRI images, modified so the background star field could be seen, and assembled with OpenCV. Because Pluto and Charon orbit each other around a point above Pluto’s surface, simply putting Pluto in the center of each frame wouldn’t work. It’s the best visual explanation of this weird arrangement yet, all brought to you by the magic of OpenCV and Python.
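
We don’t have [Matt]’s exact pipeline, but the assembly step is simple enough to sketch: stretch each raw LORRI frame so the faint star field shows up, then write the stack out as an animated GIF. In this illustrative version we lean on imageio for the GIF writing, and the file names are placeholders.

```python
import glob
import cv2
import imageio

# Minimal sketch of stacking processed frames into an animated GIF
# (illustrative file names; not [Matt]'s exact pipeline).
frames = []
for path in sorted(glob.glob("lorri/*.png")):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # Stretch the histogram so the dim background star field becomes visible.
    img = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX)
    frames.append(img)

imageio.mimsave("pluto_flyby.gif", frames, duration=0.1)
```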

On the subject of Kickstarter creators that don’t understand the conservation of energy, I present this.

We don’t know exactly what’s going on with this one, but here’s a swimming pool covered with RGB LEDs. It’s controlled by two Rainbowduinos, and looks like the coolest disco floor you’ve ever seen.

[Frank]’s 2011 Hyundai Santa Fe wasn’t cool enough, so he added an F16 flight stick to his shift knob. The choice of joystick is paramount here: Saitek joysticks look too techy, Logitech ones are too expensive, and the Warthog H.O.T.A.S. costs $400. Joysticks are extremely niche peripherals these days, it seems. He ended up strapping an old F16 joystick from the ’90s onto his shift knob, and it looks close enough to the real thing.

Two bodgers are stuffing the engine from a Toyota Celica into a 1980 Mini, and they’re trying to make it look stock. We’ve seen their project before, and now there’s a new episode. In this episode: the pedal box, the steering wheel, and figuring out how to make the car drive straight.

Hackaday Prize Entry: An Open Source Industrial Camera

Over the last few years, connecting a camera to the Internet has gotten cheaper and cheaper. The advances that made this possible came not from security cameras but from tiny cell phone camera modules, ARM boards, and embedded computing. Right now, if you want a livestream of your back yard, you’d probably get a Raspberry Pi and camera module. This will work for 90% of cases, but what if you want to livestream a slightly harsher environment? What if you want image processing right on the camera? What if you want this camera to have a rating for environmental protection?

[Apodiant]’s entry for the 2015 Hackaday Prize tackles exactly these problems. It’s an Open Source Industrial Smart Camera with Ethernet, USB, and serial outputs and an ARM CPU for image processing, all tucked away in a sturdy aluminum enclosure.

The preliminary BOM for this camera is built around an i.MX6, a very capable ARM processor that can run Linux and OpenCV. The image sensor is a 1.2 megapixel unit [Apodiant] already has experience with, and the enclosure is an off-the-shelf part, so anyone who wants to can build their own.


If this sort of setup sounds familiar, you’re right: there have been a few projects that have taken camera modules, added a powerful microcontroller, and run image processing on them. The latest in a long line of these projects is the OpenMV. That had a successful Kickstarter, and since [Apodiant] is going for the Hackaday Prize Best Product competition, it looks like a good fit.


Googly Eyes Follow You Around the Room

If you’re looking to build the next creepy Halloween decoration or simply thinking about trying out OpenCV for the first time, this next project has you covered. [Glen] made a pair of giant googly eyes that follow you around the room using some servos and some very powerful software.

The project was documented in three parts. In Part 1, [Glen] models and builds the eyes themselves, including installing the servo motors that will eventually move them around. The second part involves an Arduino and power supply that will control the servos, and the third part goes over using OpenCV to track faces.

This part of the project is arguably the most interesting if you’re new to OpenCV; [Glen] uses the software package to detect faces. From there, the computer picks out the most prominent face and sends commands to the Arduino to move the eyes to the appropriate position. The writeup goes into great detail, from Arduino code to installing Ubuntu to running OpenCV for the first time!
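
For a taste of the face-tracking side, here’s a condensed sketch along the same lines — not [Glen]’s actual code — where OpenCV’s stock Haar cascade finds faces, the largest one wins, and its horizontal position is mapped to a servo angle sent to the Arduino over a made-up one-byte serial protocol.

```python
import cv2
import serial

# Condensed sketch of face tracking driving a servo (illustrative only:
# paths, serial port, and the one-byte protocol are assumptions).
cascade = cv2.CascadeClassifier(cv2.data.haarcascades +
                                "haarcascade_frontalface_default.xml")
arduino = serial.Serial("/dev/ttyACM0", 9600)      # Arduino driving the eye servos
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces):
        # Follow the most prominent (largest) face.
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        # Map the face's horizontal center to a 0-180 servo angle.
        angle = int((x + w / 2) / frame.shape[1] * 180)
        arduino.write(bytes([angle]))
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
```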

We’ve featured some of [Glen]’s projects before, like his FPGA-driven LED wall, and it’s good to see he’s still making great things!

Continue reading “Googly Eyes Follow You Around the Room”