Get Digital Plastic Surgery Thanks To OpenFrameworks And Some Addons

[Kyle McDonald] is trying out a new look, at least in the digital world, with the help of some openFrameworks video plugins. He’s working with [Arturo Castro] to make real-time facial substitution as realistic as possible. [Arturo’s] own video takes a different approach to shading and coloring the facial alterations, which makes them a bit less realistic than what [Kyle] was able to accomplish (see that clip after the break).

The setup depends on some facial tracking software developed by [Jason Saragih]. That package is wrapped in ofxFaceTracker (already linked at the top of this article), which makes it play nicely with openFrameworks. From there, it’s just a matter of image processing. If you think you’re up to the challenge, grab your own copies of the source code and get to work. We’re shocked by how real this looks, even when [Kyle] grabs his cheeks and stretches them out. If someone can fix some of the artifacts around the edges of the sampled faces, this would be ready to use for video conferencing.
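
If you want to poke at the tracking side yourself, it doesn't take much openFrameworks code to get started: feed camera frames to the tracker and get a face mesh back. What follows is only a minimal sketch modeled on the ofxFaceTracker example apps; the exact method names may vary between addon versions, and the real face-substitution app layers texture mapping and color-matching shaders on top of this.

```cpp
// ofApp.h -- minimal ofxFaceTracker sketch (illustrative, not the authors' code)
#pragma once

#include "ofMain.h"
#include "ofxCv.h"
#include "ofxFaceTracker.h"

class ofApp : public ofBaseApp {
public:
    void setup() {
        cam.setup(640, 480);   // initGrabber() on older openFrameworks releases
        tracker.setup();       // loads Saragih's face model data from bin/data
    }

    void update() {
        cam.update();
        if (cam.isFrameNew()) {
            // ofxCv::toCv() wraps the grabber's pixels as a cv::Mat for the tracker
            tracker.update(ofxCv::toCv(cam));
        }
    }

    void draw() {
        cam.draw(0, 0);
        if (tracker.getFound()) {
            // The tracked face as a mesh in image coordinates; the substitution
            // step texture-maps a second, pre-tracked face onto this mesh.
            tracker.getImageMesh().drawWireframe();
        }
    }

    ofVideoGrabber cam;
    ofxFaceTracker tracker;
};
```

The full substitution app goes further, roughly by running a second tracker over a still image of the replacement face and drawing the live mesh with that image bound as a texture, but the loop above is the heart of it.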

It kind of makes us think of technology seen in The Running Man.

[vimeo http://vimeo.com/29348533 w=470]

[Thanks Luke]

27 thoughts on “Get Digital Plastic Surgery Thanks To OpenFrameworks And Some Addons”

  1. I think this is amazing. This is step 2 in the direction of living, breathing, real-life avatars. Just imagine walking down the street with some futuristic glasses and seeing everyone how they want to be seen. Expand this to full body, get some brand backing, make an app store with different clothing choices. That’s where I want to see us in 10 years. I like where your head’s at. (I’m so punny)

  2. Better AAM-based facial trackers have been around for over 5 years. (GPL, too)

    The kid didn’t cite the 10 other works related to this method or post his code for peer review.

    Science fail…

    1. Because he was making working software, not a scientific paper.
      If coders cited every work related to any of their code in every video, the list would be a mile long: the video libs, the Windows driver libs, the 3D algorithms used to project vertices from 3D space to 2D, the shading system, probably the compression on the texture formats… it would go on and on.

      We all work on the shoulders of giants as it were.

      oh, and if you think you can make a better one, go ahead.

  3. “Just imagine walking down the street with some futuristic glasses, and seeing everyone how they want to be seen”

    Seconded.
    AR + this tech will be hella cool. Forget fashion; we would have full-body avatars that change whenever we want.
    I’m sure on the one hand people will say this will make us live too much in a fantasy world.

    But, on the other, it would probably make us less materialistic. If anyone or anything can look like anything else for no cost… what’s the value of image anymore?

    Wonderful software either way.

  4. Pretty darn awesome, and I attribute the artifacts mostly to imperfect initial images. For example, Obama’s smile is very wrinkly, and it transferred the wrinkles very heavily onto Kyle’s face.

    Glad to see the Apple issue didn’t stop you from hacking some more!

  5. OK, I think I found the problem. Almost all of the files are available on GitHub, but there’s a folder for Jason Saragih’s tracker that has some necessary files which aren’t included.

    HAD was mistaken in saying it was included. I’ve requested the files from Saragih and will see whether that was the problem when I get them.
