Here’s an art exhibit that does its own painting. The Senseless Drawing Bot (translated) uses the back-and-forth motion of the wheeled base to get a double-pendulum arm swinging. At the end of the out-of-control appendage, a can of spray paint is let loose. We’re kind of surprised by the results, as they don’t look like a machine made them.
The video after the break gives a pretty good synopsis of how the robot performs its duties. The site linked above is a bit difficult to navigate, but if you start digging you’ll find a lot of build information. For instance, it looks like this was prototyped with a small RC car along with sticks of wood as the pendulums.
We can’t help but be reminded of this robot that balances an inverted double pendulum. We wonder if it could be hacked to purposefully draw graffiti that makes a bit more sense than what we see here.
[vimeo http://vimeo.com/30780208 w=470]
[Thanks Brian via IEEE Spectrum]
I disagree. It does look like a machine made them. The constraints on the top and bottom would not be so hard had they been done by a person freehand.
Your arm also has constraints as to how far up or down it can go, no?
But the constraints are not as rigid. Posture and even walking would change the bounds of the constraints.
Well, it looks a bit like the motion in cursive writing; maybe with some kind of position sensor and processing you could get it to write real words.
But then it would be a single-arm CNC machine.
Ya, but has anyone actually done that before? Simpsons? Arduino nuts? Anyone?
Didn’t think so. I think a one-arm/axis CNC is a fascinating idea.
I thought double pendulums exhibit chaotic behaviour. Sure, with identical starting conditions it would be initially repeatable, but presumably bearing wear in the linkages would cause differences to occur over time, and even then everything else would have to be the same – which is highly unlikely down to even the mm level.
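To make that chaos point concrete, here’s a minimal simulation sketch of a frictionless double pendulum using the standard equations of motion. It’s not the bot’s code – the masses, arm lengths, and initial angles are made-up values – but it shows how a nano-radian difference in the starting angle blows up into completely different trajectories within seconds.

```python
import math

# Minimal double-pendulum simulation (standard frictionless equations of motion).
# Masses, lengths, and initial conditions are invented for illustration.
G = 9.81           # gravity, m/s^2
M1, M2 = 1.0, 1.0  # point masses (assumed)
L1, L2 = 1.0, 1.0  # arm lengths in metres (assumed)

def derivs(state):
    """Time derivatives of (theta1, omega1, theta2, omega2)."""
    t1, w1, t2, w2 = state
    d = t1 - t2
    den = 2 * M1 + M2 - M2 * math.cos(2 * d)
    a1 = (-G * (2 * M1 + M2) * math.sin(t1)
          - M2 * G * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * M2 * (w2 * w2 * L2 + w1 * w1 * L1 * math.cos(d))) / (L1 * den)
    a2 = (2 * math.sin(d) * (w1 * w1 * L1 * (M1 + M2)
          + G * (M1 + M2) * math.cos(t1)
          + w2 * w2 * L2 * M2 * math.cos(d))) / (L2 * den)
    return (w1, a1, w2, a2)

def rk4_step(state, dt):
    """One classic Runge-Kutta integration step."""
    k1 = derivs(state)
    k2 = derivs(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = derivs(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = derivs(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (p + 2 * q + 2 * r + u)
                 for s, p, q, r, u in zip(state, k1, k2, k3, k4))

# Two runs whose initial angles differ by one nano-radian.
a = (2.0, 0.0, 2.0, 0.0)
b = (2.0 + 1e-9, 0.0, 2.0, 0.0)
dt = 0.001
for step in range(1, 20001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 5000 == 0:
        print(f"t={step*dt:5.1f}s  angle difference={abs(a[0]-b[0]):.3e} rad")
# The difference grows roughly exponentially -- after ~20 s the two runs bear
# no resemblance to each other, which is why exact repeatability is hopeless.
```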
Couldn’t you use an accelerometer and feedback control to get a stable system? A cool example of a stabilised triple pendulum using control theory: http://youtu.be/cyN-CRNrb3E
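For a feel of what “sense the angle, push back” means, here’s a toy sketch: a single inverted pendulum on a cart whose acceleration we command directly, stabilised with a hand-tuned PD loop. It is only the textbook single-pendulum case with invented gains and lengths – the triple-pendulum video above uses far more sophisticated control than this.

```python
import math

# Toy feedback-control demo: single inverted pendulum on a cart whose
# acceleration is the control input (a common textbook simplification).
# All gains and parameters are invented for illustration.
G, L = 9.81, 0.5      # gravity, pendulum length (assumed)
KP, KD = 40.0, 8.0    # hand-tuned PD gains (assumed)
DT = 0.002            # control/integration step, seconds

theta, omega = 0.3, 0.0   # start 0.3 rad away from upright
for step in range(3001):
    # "Sensor" reading: on real hardware this would come from an IMU/encoder.
    accel_cmd = KP * theta + KD * omega          # cart acceleration command
    # Pendulum dynamics with the cart acceleration as input,
    # theta measured from the upright position.
    alpha = (G * math.sin(theta) - accel_cmd * math.cos(theta)) / L
    omega += alpha * DT
    theta += omega * DT
    if step % 500 == 0:
        print(f"t={step*DT:4.1f}s  theta={theta:+.4f} rad")
# theta decays toward zero: the feedback holds the pendulum upright even
# though that position is unstable on its own.
```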
The video was edited with the same robot.
Senseless indeed…
Optical character recognition is good enough that I would consider having the robot train itself to make shapes that it recognises as the desired letters. It would hopefully accept varied enough writing to still appear freeform while being legible. While allowing the robot to change arm length might help make the top and bottom constraints less obvious, among other possible improvements, I think they’ll also be less apparent if the density of lines is reduced, since individual arcs often don’t reach those limits.
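As a rough sketch of that self-training loop – not anyone’s actual implementation – you could generate candidate strokes, render them, ask an off-the-shelf OCR engine what it sees, and keep whatever scores best against the target text. Here the stroke model, the scoring, the target word, and the choice of Pillow plus pytesseract are all invented for illustration; a real robot would search over arm motions rather than pixels.

```python
import random
from PIL import Image, ImageDraw      # pip install pillow
import pytesseract                    # pip install pytesseract (needs the tesseract binary)

TARGET = "RAD"   # made-up target text

def render(strokes, size=(300, 120)):
    """Draw a list of (x0, y0, x1, y1) line segments on a white canvas."""
    img = Image.new("L", size, 255)
    draw = ImageDraw.Draw(img)
    for x0, y0, x1, y1 in strokes:
        draw.line((x0, y0, x1, y1), fill=0, width=6)
    return img

def random_strokes(n=12, size=(300, 120)):
    """A crude stand-in for a stroke generator: random line segments."""
    w, h = size
    return [(random.randint(0, w), random.randint(0, h),
             random.randint(0, w), random.randint(0, h)) for _ in range(n)]

def score(strokes):
    """How many characters of the target the OCR engine finds, in order."""
    text = pytesseract.image_to_string(render(strokes)).upper()
    hits, pos = 0, 0
    for ch in TARGET:
        found = text.find(ch, pos)
        if found >= 0:
            hits, pos = hits + 1, found + 1
    return hits

best, best_score = random_strokes(), -1
for _ in range(200):                      # crude random search over stroke sets
    candidate = random_strokes()
    s = score(candidate)
    if s > best_score:
        best, best_score = candidate, s
print(f"best candidate matched {best_score}/{len(TARGET)} target characters")
render(best).save("best_scribble.png")
```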
In the main image used in this article, the right side of the squiggles seems to spell “Rao”, or perhaps “Ralo”.
Hmm, pairing this robot… possibly an entire team of these robots… with observational rovers running Optical Character Recognition…
A modern twist on the Infinite Monkey Theorem?
@Techartisian
You have just made one hell of a point, sir. I tip my hat to you and hope this morning’s coffee is a good one for your sake.
Wild Style!
Could be used by companies or municipalities to “dis” existing graffiti for a lower cost than having to repaint (and produce a fresh canvas for the taggers). Subway operators and railroads could place a stationary one alongside their tracks and “redux” each car as it rolls past.
Looks like the crack-addict graffiti all over town.