Simulated annealing


Here’s an update on our earlier post about genetic programming. Altered Qualia has posted a new implementation of [Roger Alsing]‘s idea. It starts with 50 polygons and randomly changes one parameter with each optimization step. If the change results in fewer differences from the target image, it’s kept as the new best DNA. This search method is similar to simulated annealing. The image above is the result of 1500 good mutations out of 35,900 attempted. The implementation lets you choose any image, but smaller images make the fitness calculation faster. It’s written in JavaScript using the <canvas> element. You’ll definitely get better performance using newer browser builds.
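The keep-if-better loop described above might be sketched like this (a minimal illustration; the function names and parameters are my own assumptions, not code from the actual implementation):

```javascript
// Hypothetical sketch of the "mutate one parameter, keep if better" search.
// mutate() and fitness() stand in for the real DNA mutation and the
// image-difference score; a lower fitness value means a closer match.
function optimize(target, dna, steps, mutate, fitness) {
  let best = dna;
  let bestScore = fitness(best, target);
  for (let i = 0; i < steps; i++) {
    const candidate = mutate(best);          // change one random parameter
    const score = fitness(candidate, target);
    if (score < bestScore) {                 // fewer differences: keep it
      best = candidate;
      bestScore = score;
    }
  }
  return best;
}
```

Because a worse candidate is always discarded, the score can only improve or stay put — which is exactly what several commenters below identify as greedy hill climbing.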

[Original image by R Stevens]

[via Waxy]

24 thoughts on “Simulated annealing”

  1. None of the “Genetic” or “Simulated annealing” algorithms have anything to do with what they claim. Neither is anything more than naive hill climbing ( http://en.wikipedia.org/wiki/Hill_climbing ).

    Seriously, this is about the worst method of doing this. The least you could do is add some momentum to the simulation to speed it up and avoid local maxima.

  2. I have to admit, these things are at least addicting to watch.

    I took this chance to upgrade to the latest build of Firefox. The performance of this script was noticeably faster. Cool.

  3. This sounds like a bog standard ‘greedy’ hill climbing algorithm to me, which has nothing to do with simulated annealing.

    Simulated annealing would involve a probability of choosing an image which is -less- like the target image. This prevents the algorithm getting stuck in dead-ends (local maxima).

  4. I think this is cool. Efficient LED blinkers get old after a while..just kidding.

    I saw a project years back using genetic programming that generated vectorized C++, compiled it, analyzed its efficiency, and generated a new algorithm until it couldn’t get any more efficient.

  5. How is this a best fit? Clearly, looking at the image, there are fewer than 50 polygons. I’m not even sure how this is genetic programming. What DNA do you start out with? If it is random, then how do you decide what is possible? How does the program know which is the best match? If you are doing this by eye, then your program is not truly genetic. If you keep randomly changing points of the polygon, eventually you will end up with something similar. How do you decide the coloring, or is that random too? Why doesn’t the program learn and evolve, leaving out any interaction other than the input of an image, and then spit out the final best fit? The term genetic programming is used quite loosely here. What is the environment that the program is in? How does it evolve? This is basically randomizing polygons until you get a best-fit image by eye.

  6. Max, RTFA. It starts with 50 randomly placed polygons, all of them invisible. That’s why you don’t see them all. The best match is determined by a pixel-by-pixel comparison of the test image with the original image: the color differences are summed, and a smaller total is better. A mutation is a random change to either one pixel or the color of a single polygon. As to whether it’s GP or not, you’re entitled to your own opinion. The author says: “I will claim that this is a GP due to the fact that the application clones and mutates an executable Abstract Syntax Tree (AST). Even if the population is small, there is still competition between the parent and the child, the best fit of the two will survive.”
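    The pixel-by-pixel fitness described here could be sketched like this (a hedged illustration; the function name and the squared-difference metric are my assumptions, not Altered Qualia’s actual code):

```javascript
// Illustrative fitness function. Both arguments are flat RGBA arrays, as
// returned by canvas getImageData().data. Per-channel differences are
// squared and summed; a smaller total means a closer match to the target.
function fitness(testPixels, targetPixels) {
  let diff = 0;
  for (let i = 0; i < targetPixels.length; i += 4) {
    const dr = testPixels[i]     - targetPixels[i];     // red
    const dg = testPixels[i + 1] - targetPixels[i + 1]; // green
    const db = testPixels[i + 2] - targetPixels[i + 2]; // blue
    diff += dr * dr + dg * dg + db * db;                // alpha ignored
  }
  return diff;
}
```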

  7. icept is 100% correct: this is Hill Climbing, not Simulated Annealing. It’s also completely unrelated to Genetic Algorithms.

    I expect crappy science reporting from other places, not Hack a Day. Eliot, you should really know better.

  8. phoenix, based on the link you provided, hill climbing “…starts with a random (potentially poor) solution, and iteratively makes small changes to the solution, each time improving it a little…” This implementation does not do that. It makes a random change, checks to see if it’s more fit, and then either keeps or discards the change. That’s a mutation and a fitness function and that puts it in the realm of GP.

  9. In my response to max, that should read “A mutation is a random change to either one _vertex_ or the color of a single polygon…”

  10. Dan, so what you’re saying is the image is ideal DNA for the environment. The program searches or mutates until a DNA is best fitted based on parameters like polygons and iterations. Fair enough; I still don’t think it’s GP. Oh well, semi-interesting anyway.

  11. Man, how the hell can anyone claim that blurring up a picture to the point that you can’t see any details qualifies as “best”? WTF!? Did someone reverse the captions? If I want to screw up my focus algorithms for a new digital camera, I’ll just smear up the lens with grease! We don’t need no stinkin’ “annealing” BS to hose up a picture!

  12. This stuff is ubercool — especially when coded in client-side javascript.

    Is it good for anything useful? That’s a bit like asking if science is good for anything useful. If you’re not sure about the answer to that question then I don’t know what to tell you. The example presented here is geared toward a pretty specific problem but, then again, so is nearly every post that has to do with a specific piece of equipment/device/programming framework/whatever. Stretch your imagination a bit (or just use google) and I’m sure you’ll find plenty of applications for this sort of thing.

  13. I wrote my own GA version, and my results were actually very similar. The problem is that there are a lot of interdependencies between the genes/triangles, especially when transparency is involved. You need multiple simultaneous changes in color and point location for an improvement to propagate, which doing a single mutation at a time won’t allow.

    In summary, it’s easy to get stuck in a local minimum and never reach the global one.

    I think the first one did such a good job because his mutations did a much better job of getting out of local minima.

  14. I have to agree: this is just a hill climbing algorithm, aka “I failed algorithms 101”. This really has no relation to genetic algorithms.
