Decorate Your 3D Prints With Detailed Hydrographic Printing

It’s like the old quip from [Henry Ford]: You can have your 3D prints in any color you want, as long as it’s one. Some strides have been made toward bringing more color to your extruded goodies, but for anything beyond a few colors, you’re going to need post-print processing of some sort. For photorealistic 3D prints, you might want to look into a simple hydrographic printing method that can be performed right on the printer.

If some of the prints in the video below look familiar, it’s because we covered the original method when it was presented at SIGGRAPH 2015. The technique uses computational modeling of complex surfaces to compose a distorted image that stretches back into shape when the object is dipped. [Amos Dudley] was intrigued enough to contact the original authors for permission to use their software. He got a resounding “Nope!”; it appears the authors’ institution isn’t big on sharing information. So, [Amos] hacked the method.

In place of the original software, [Amos] used Blender to simulate the hydrographic film as a piece of cloth interacting with the 3D-printed surface. This allowed him to print an image on PVA film that “un-distorts” as the object is dipped. He built a simple tank with an overflow for the printer bed, used the Z-axis to dip the print, and voilà! Photorealistic frogs and globes.
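To get a feel for why the printed image has to be pre-distorted at all, consider a toy model (much simpler than [Amos]’s cloth simulation, and purely illustrative): if a sphere of radius R is dipped pole-first and the film is assumed to cling without slipping, the film material that lands at polar angle θ on the surface has traveled the arc length R·θ, even though that point only appears at the projected radius R·sin θ when viewed from above. So a pixel meant for that point should be printed at radius R·θ on the flat film. All names and numbers here are made up for the sketch:

```python
import math

def predistort_radius(R, theta):
    """Radius on the flat film where a pixel should be printed so it lands
    at polar angle theta (measured from the dip pole) on a sphere of
    radius R. Toy model: film clings without slipping, so it travels the
    arc length R * theta along the surface."""
    return R * theta

def projected_radius(R, theta):
    """Where the same surface point appears in a straight-down view."""
    return R * math.sin(theta)

# Example: a 25 mm sphere. Near the pole the two radii almost agree;
# toward the equator the image must be pushed outward by theta/sin(theta).
R = 25.0
for theta in (0.2, 0.8, math.pi / 2):
    flat = predistort_radius(R, theta)
    proj = projected_radius(R, theta)
    print(f"theta={theta:.2f}: print at r={flat:.1f} mm, lands at r={proj:.1f} mm")
```

A real dip involves film stretch, water pressure, and the object’s actual geometry, which is exactly what the cloth simulation captures and this back-of-the-envelope model does not.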

[Amos]’ method has its limitations, but the results are pretty satisfying already. With a little more tweaking, we’re sure he’ll get to the point that the original authors did, and without their help, thank you very much.


19 thoughts on “Decorate Your 3D Prints With Detailed Hydrographic Printing”

  1. I have to get my hands on some hydrographic sheets. This would be an awesome method of adding detail onto objects, but I do think base colours might as well be painted in a more traditional way.
    Hand dipping seems viable if the tank dimensions are right; a jig probably wouldn’t hurt.

  2. That was rather impressive. Thanks for the share. I wonder if you could do this with some sort of conductive ink and make 3D PCBs or sensor arrays. It would be sweet to add touch sensing to any common object… or maybe make some really interesting electromagnets for some unusual speakers.

  3. Weird that the earlier work couldn’t be published. These are academic institutions, benefiting from taxpayer funds. I guess they sometimes see a chance to make a buck with technology licensing after investing some of their own cash into the research?

    1. It’ll probably be patented and sold off to some company, delaying when we’ll be able to use it generally. This is the hidden time cost with new developments most people don’t consider: once something is developed, you have to then wait for the patent to expire before you can use it yourself for anything that isn’t a trinket or two.

      This doesn’t apply to copyrighted software, which can be re-implemented, but for processes and inventions that can be patented, it’s a real drag. More so if the patent ends up in the portfolio of a patent troll who sits on it until it expires (meaning you can’t even buy a product from them and mod it for your purposes via the doctrine of first sale).

      One of the big ones I’m rather depressed about will be the first good synthetic muscles (efficient electrical ones, not heated fishing line). While they’d be immensely useful for prosthetics, they’ll probably be patented six ways to Sunday when they’re invented. Then anyone wanting to build their own prosthetics or robotics would have to buy enough of one of the products using them to strip and recycle the muscle fibres (the expensive, time-consuming, and sub-optimal hacker route), or else either negotiate with the manufacturer for a custom part or purchase the rights to manufacture them themselves (the even more expensive route).
