Play Music with your Painting Using Teensy

[sab-art], a collaboration between [Sophia Brueckner] and [Eric Rosenbaum], has created a touch-sensitive musical painting. Basic acrylic paint covers the majority of the canvas. Once that is dry, conductive paint is used to create the shapes that serve as capacitive touch sensors. To make things more robust, nails are hammered through each painted shape and connected with wiring on the back of the painting. These wires run to the inputs of a Teensy++ 2.0, which uses Arduino code based on the MaKey MaKey to output MIDI. The MIDI is sent to a Mac Mini, which synthesizes the sound using Ableton Live. Any MIDI-processing software would work, though. For this particular painting, external speakers are used, but incorporating speakers into your own composition is certainly possible.
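
The general idea on the firmware side is easy to sketch. Below is a minimal Arduino-style C++ example for a Teensy that polls a few input pins wired to the conductive shapes (MaKey MaKey style, with very high-value pull-up resistors so that a grounded fingertip pulls a pin low) and sends a USB MIDI note on and off for each touch. The pin and note assignments are made up for illustration; this is not [sab-art]’s actual code.

```cpp
// Minimal sketch of the idea (not the artists' actual firmware): each
// conductive shape is wired to a Teensy input with a very high-value
// (~10 MΩ) external pull-up, MaKey MaKey style, so a grounded fingertip
// pulls the pin low. Compile with the USB Type set to "MIDI".

const int NUM_SHAPES = 4;
const int shapePins[NUM_SHAPES] = {0, 1, 2, 3};      // illustrative pin choices
const int midiNotes[NUM_SHAPES] = {60, 62, 64, 67};  // C4, D4, E4, G4
const int MIDI_CHANNEL = 1;

bool pressed[NUM_SHAPES] = {false, false, false, false};

void setup() {
  for (int i = 0; i < NUM_SHAPES; i++) {
    pinMode(shapePins[i], INPUT);   // the external mega-ohm pull-ups do the work
  }
}

void loop() {
  for (int i = 0; i < NUM_SHAPES; i++) {
    bool touched = (digitalRead(shapePins[i]) == LOW);
    if (touched && !pressed[i]) {
      usbMIDI.sendNoteOn(midiNotes[i], 99, MIDI_CHANNEL);   // finger down
      pressed[i] = true;
    } else if (!touched && pressed[i]) {
      usbMIDI.sendNoteOff(midiNotes[i], 0, MIDI_CHANNEL);   // finger up
      pressed[i] = false;
    }
  }
  delay(5);  // crude debounce
}
```

A Teensy compiled in MIDI mode shows up as a class-compliant USB MIDI device, so Ableton Live (or any other MIDI-capable software) just maps each incoming note to whatever sound you like.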

A nice aspect of this project is that it can be as simple or as complex as you choose. Multiple conductive shapes can be connected through the back to the same Teensy input so that they play the same sound. While [sab-art] went with a more abstract look, this can be used with any style. Imagine taking a painting of Dogs Playing Poker and having each dog bark in its respective breed’s manner when you touch it, or having spaceships make “pew pew” noises. For a truly meta moment, an interactive MIDI painting of a MIDI keyboard would be sublime. [sab-art] is refining the process with each new painting, so even more imaginative musical works of art are on the horizon. We can’t wait to see and hear them!

Continue reading “Play Music with your Painting Using Teensy”

Robot Painter Works Like a Photobooth


[Ben], [David], [Drew], [Kayla], and [Peter] built a robotic artist as their senior design project. This mashes up a bunch of different project ideas, but the thing we like the most about it is that it works much like a photo booth that produces a painting. A Raspberry Pi uses a webcam to snap the picture, converts the image to three colors (plus the white background of the canvas), and sets the robot in motion. The team laments that, while initial testing of the completed project (seen in the clip below) worked out quite well, it took hours to produce the painting. What do they expect? It’s art!
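
Reducing a webcam frame to three paint colors plus the white canvas is essentially nearest-palette quantization. The C++ sketch below shows that step in isolation; the palette values and pixel layout are assumptions rather than the team’s actual implementation.

```cpp
#include <cstddef>
#include <cstdint>

// A hypothetical 4-entry palette: three paints plus the white canvas.
struct Color { uint8_t r, g, b; };
const Color PALETTE[4] = {
  {220, 30, 40},    // red paint
  {30, 80, 200},    // blue paint
  {250, 210, 40},   // yellow paint
  {255, 255, 255},  // white = leave the canvas bare
};

// Index of the palette entry closest to a pixel (squared RGB distance).
size_t nearestPaletteIndex(Color px) {
  size_t best = 0;
  long bestDist = -1;
  for (size_t i = 0; i < 4; i++) {
    long dr = (long)px.r - PALETTE[i].r;
    long dg = (long)px.g - PALETTE[i].g;
    long db = (long)px.b - PALETTE[i].b;
    long dist = dr * dr + dg * dg + db * db;
    if (bestDist < 0 || dist < bestDist) { bestDist = dist; best = i; }
  }
  return best;
}

// Quantize a whole frame into palette indices (0-3), one per pixel.
void quantizeFrame(const Color *pixels, uint8_t *indices, size_t count) {
  for (size_t i = 0; i < count; i++) {
    indices[i] = (uint8_t)nearestPaletteIndex(pixels[i]);
  }
}
```

The real pipeline presumably does more on top of this (noise cleanup, turning color regions into brush strokes), so treat it as the gist of the color reduction only.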

This is quite a bit different from the WaterColorBot (whose manufacturing process we just looked in on yesterday). WaterColorBot uses a flat canvas and a gantry system. This offering, which is called PICASSAU, uses an upright canvas with the paintbrush mounted in much the same way as a plotter robot. The biggest difference is the ability to pivot the paintbrush, both to pick up more paint and for cleaning in between color changes.

Continue reading “Robot Painter Works Like a Photobooth”

Priceless Paintings – Scanned and Printed in 3D


When we think of works by Van Gogh and Rembrandt, most of us remember a picture, but we aren’t accustomed to seeing the actual painting. [Tim Zaman], a scientist at Delft University of Technology in the Netherlands, realized that the material presence of the paint conveys meaning as well. He wanted to create a lifelike reproduction in full dimension and color. While a common laser-based technique could have been used for depth mapping, its resolution depends on the width of the line or dot, and the camera cannot capture color data simultaneously with this method. In his thesis, [Tim] goes into great detail on a hybrid imaging technique involving two cameras and a projector. He and his team eventually used two 40-megapixel Nikon cameras in conjunction with a fringe projector to capture a topographical map with an in-plane resolution of 50 μm and a depth resolution of 9.2 μm.
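
For a taste of how a fringe projector yields depth without reading the whole thesis: a standard N-step phase-shifting setup projects a sinusoidal pattern shifted by 2π/N between exposures and recovers the fringe phase at every pixel, which calibration then turns into height. The C++ snippet below is the textbook computation, not necessarily the exact hybrid two-camera variant [Tim] describes.

```cpp
#include <cmath>
#include <cstddef>

const double PI = 3.14159265358979323846;

// Textbook N-step phase shifting: the projector shows a sinusoidal fringe
// pattern N times, shifted by 2*pi/N between exposures, and the camera records
// each pixel's intensity in every exposure. The wrapped fringe phase at a
// pixel falls out of a single atan2; phase unwrapping and calibration (not
// shown here) then convert phase into height.
double fringePhase(const double *intensity, size_t n) {
  double num = 0.0, den = 0.0;
  for (size_t k = 0; k < n; k++) {
    double delta = 2.0 * PI * k / n;
    num += intensity[k] * std::sin(delta);
    den += intensity[k] * std::cos(delta);
  }
  return std::atan2(-num, den);   // wrapped phase in (-pi, pi]
}
```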

We can’t find a lot of information on the printing process they used, other than references to high-resolution 3D printers by Océ (a Canon company). That said, [Tim] has provided a plethora of images of some of the reproductions, and we have to say they look amazing. The inclusion of depth information takes this a big step further than that gigapixel scanning setup we saw recently.

Check out the BBC interview with [Tim], as well as time-lapse videos of the scanning and printing process, after the break.

Continue reading “Priceless Paintings – Scanned and Printed in 3D”

Electric paint brush loads itself with paint

Meet [Jahangir Ahmad]. He’s a 19-year-old from India who recently won third place in a contest put on by the National Innovation Foundation. Here he’s posing with the electric paint brush, which he developed after seeing some local painters struggling with brushes and buckets at the top of a ladder.

His system uses a 1 hp motor to pump paint from the bucket directly into the brush. Once the paint enters the handle, a distributor splits the flow into four parts so that it reaches the bristles evenly. The pump is actuated by a controller that can be worn on the painter’s belt. When you get a little low on paint, just hit the button and you’ll get a boost. Since the base of the bristles is meant to hold a small reservoir of paint, this has the potential to be better than dipping in a bucket.
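
We don’t know what’s inside the belt-worn controller, but the behavior described (press a button, get one timed boost of paint) is simple to sketch. Everything below (pin numbers, the pump driver, the timing) is an assumption, written in Arduino-style C++ purely to illustrate the idea.

```cpp
// Hypothetical "press for a boost of paint" controller. Assumes a pushbutton
// on pin 2 (to ground, internal pull-up) and a relay or motor driver for the
// pump on pin 3. The timing value is a guess.

const int BUTTON_PIN = 2;
const int PUMP_PIN   = 3;
const unsigned long BOOST_MS = 1500;   // how long one boost runs the pump

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(PUMP_PIN, OUTPUT);
  digitalWrite(PUMP_PIN, LOW);
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // button pressed
    digitalWrite(PUMP_PIN, HIGH);         // run the pump...
    delay(BOOST_MS);                      // ...for one boost
    digitalWrite(PUMP_PIN, LOW);
    while (digitalRead(BUTTON_PIN) == LOW) {
      delay(10);                          // wait for release: one press, one boost
    }
  }
}
```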

[via Reddit via Home Harmonizing via Damn Geeky]

[Jackson Pollock] is now a robot

Even though abstract expressionism died out several decades ago, robots are still chugging along dripping nihilistic pigment onto a cold, uncaring canvas. [Liat] and [Assaf] built a robot named The Originals Factory to produce paintings in the style of abstract expressionism, a style arguably best represented by [Jackson Pollock] and his ‘drip paintings.’

The build is surprisingly simple – there are four containers filled with C, M, Y, and K pigments. Pumps transport these paints to a print head mounted on an aluminum rail above a canvas. The software portion of the build is rather interesting. Instead of pixels, the image is rendered in ‘vixels’ – vertical lines of a specific length and color. Although we don’t see any examples of more precise work, [Liat] tells us The Originals Factory can be used to plot graphs on the canvas.
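
The ‘vixel’ rendering isn’t documented beyond that description, but you can picture it as: for each column of the image and each of the four pigments, decide how far down the canvas that color’s vertical line should run. Here is a hedged C++ sketch of that idea, with a made-up data layout.

```cpp
#include <cstdint>
#include <vector>

// One vertical line to squirt: which column, which pigment, and how far down
// the canvas it should run (0.0 = nothing, 1.0 = full height).
struct Vixel {
  int column;
  int pigment;    // 0 = C, 1 = M, 2 = Y, 3 = K
  float length;
};

// Turn a per-pixel CMYK coverage image (values 0-255, stored plane by plane)
// into one vixel per column per pigment, using the column's average coverage
// as the line length. The real machine may map coverage to length differently.
std::vector<Vixel> renderVixels(const uint8_t *cmyk, int width, int height) {
  std::vector<Vixel> out;
  for (int p = 0; p < 4; p++) {
    for (int x = 0; x < width; x++) {
      long sum = 0;
      for (int y = 0; y < height; y++) {
        sum += cmyk[((long)p * height + y) * width + x];
      }
      float avg = (float)sum / (height * 255.0f);
      out.push_back({x, p, avg});   // longer line where the column has more coverage
    }
  }
  return out;
}
```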

Check out a video of The Originals Factory squirting paint down a canvas after the break.

Continue reading “[Jackson Pollock] is now a robot”

Hackaday Links: November 24, 2011

Finally an Arduino shield that does nothing

The folks at Evil Mad Scientist Labs have finally created the Googly Eye Shield for Arduinos. With its pass-through 0.100″ headers, it adds googly eyes to your Arduino projects. Of course, instead of (or in addition to) the googly eyes you could add a breadboard, making it somewhat useful. A million fake internet points go to the first person to implement Xeyes on this thing.

Phat beats from kids’ toys

[Ville] couldn’t afford an Akai MPC for laying down some beats. Wanting a real tactile interface, he hacked this kid’s toy. It’s just an RCA cable attached to the tiny chip inside the toy. The new line out goes to his mixers where he does some pretty impressive stuff.

Mona Lisa is Vigo the Carpathian

What did we just say about real-life Xeyes? [Geert] just made a print of the Mona Lisa follow you around the room with her eyes (Dutch, translation). The build is a pair of servos and a DIY motion capture app running on a laptop. Now we need to find a print of Vigo…
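
The write-up is in Dutch, so here is only the flavor of the servo side: the motion-capture app tracks a viewer and reports a horizontal position, and something turns that into angles for the two eye servos. Everything below (pins, serial protocol, angle range) is an assumption, written as an Arduino-style C++ sketch of that idea.

```cpp
#include <Servo.h>

// Hypothetical receiver side: the laptop's motion-capture app sends the
// viewer's horizontal position (0-100) over serial, and two servos (one per
// eye) are steered toward it. Pins and angle ranges are guesses.

Servo leftEye, rightEye;

void setup() {
  Serial.begin(9600);
  leftEye.attach(9);
  rightEye.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    int pos = Serial.parseInt();              // viewer position, 0..100
    pos = constrain(pos, 0, 100);
    int angle = map(pos, 0, 100, 60, 120);    // keep the eyes in a modest arc
    leftEye.write(angle);
    rightEye.write(angle);
  }
}
```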

Quantifying heat sink efficiencies

[Mike] is an experimenter at heart. He was wondering about the efficiency of small, clip-on heat sinks versus the ones we use to defrost frozen food. The results are exactly as you would expect, but he did find something interesting – his experimental technique didn’t find much of a difference between thermal paste/grease/pads and no thermally conductive material.

Mini-fig sized R/C LEGO car

The guys at Brickmodder.net took a car from a LEGO set and converted it to remote control. The drive train and steering both use servos controlled by the smallest 3-channel receiver they could find.

[Vigo’s] stare follows you wherever you go

To decorate the office for Halloween, [Eric] decided to make [Vigo the Carpathian] stare at passersby. We hope that readers recognize this image, but for those younger hackers who don’t, this painting of [Vigo] played an important part in the classic film Ghostbusters II.

In the movie, his eyes appeared to follow anyone looking at the painting. [Eric] grabbed a Kinect and used Processing to recreate the effect in real life. The image is displayed on an LCD screen. A bit of work with Photoshop allowed him to cut the eyes out of the image and create sprites that are moved by the Processing sketch. The sketch reads data from the Kinect (which you can see perched on top of the cubicle wall) so it knows where to ‘look.’ The illusion is delightful; see for yourself in the clip after the break. We’ve already watched it a half-dozen times, and it looks like it was a real hit with the guests at the open house.

Can you believe they threw this together in just one day?
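
The Processing sketch itself isn’t reproduced here, but the heart of the effect is just mapping the viewer’s tracked position to a small offset for each eye sprite, clamped so the pupils stay inside their sockets. Here is that mapping in plain C++ form (the original is a Processing sketch); every name and number is an assumption.

```cpp
#include <algorithm>

// Hypothetical core of the effect: given the tracked viewer's position in the
// Kinect frame (normalized to 0..1 in x and y), compute how far to shift an
// eye sprite from its rest position, clamped so the pupil stays in the socket.
struct Offset { float dx, dy; };

Offset eyeOffset(float viewerX, float viewerY, float maxShiftPx) {
  // Center the coordinates: (0.5, 0.5) means the viewer is straight ahead.
  float dx = (viewerX - 0.5f) * 2.0f * maxShiftPx;
  float dy = (viewerY - 0.5f) * 2.0f * maxShiftPx;
  dx = std::max(-maxShiftPx, std::min(maxShiftPx, dx));
  dy = std::max(-maxShiftPx, std::min(maxShiftPx, dy));
  return {dx, dy};
}

// Each frame, the sketch draws the painting and then draws each eye sprite at
// (restX + dx, restY + dy) using the viewer position reported by the Kinect.
```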

Continue reading “[Vigo’s] stare follows you wherever you go”