It’s like the old quip from [Henry Ford]: you can have your 3D prints in any color you want, as long as it’s just one. Some strides have been made toward bringing more color to your extruded goodies, but for anything beyond a few colors, you’re going to need post-print processing of some sort. For photorealistic 3D prints, you might want to look into a simple hydrographic printing method that can be performed right on the printer.
If some of the prints in the video below look familiar, it’s because we covered the original method when it was presented at SIGGRAPH 2015. [Amos Dudley] was intrigued enough by the method, which uses computational modeling of complex surfaces to compose a distorted image that will be stretched back into shape when the object is dipped, to contact the original authors for permission to use the software. He got a resounding, “Nope!” It appears that the authors’ institution isn’t big on sharing information. So, [Amos] hacked the method.
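The core trick is an inverse mapping: simulate where each point of the film will land on the object, then paint each film pixel with the color of the surface point it will eventually touch. Here’s a minimal sketch of that gather step in Python; the `contact` map is entirely hypothetical and stands in for the output of the surface simulation, which is the hard part:

```python
import numpy as np

# contact[y, x] holds the (u, v) texture coordinate on the object that
# film pixel (x, y) will touch during the dip, or NaN where the film
# never makes contact. Producing this map is the job of the physical
# simulation described above; here it is purely hypothetical.
def distort_film(object_texture, contact):
    h, w, _ = contact.shape
    film = np.ones((h, w, 3))                 # untouched film stays blank
    hit = ~np.isnan(contact[..., 0])          # pixels that reach the surface
    u = (contact[..., 0][hit] * (object_texture.shape[1] - 1)).astype(int)
    v = (contact[..., 1][hit] * (object_texture.shape[0] - 1)).astype(int)
    film[hit] = object_texture[v, u] / 255.0  # gather colors through the map
    return film
```

Print the result of `distort_film()` onto the film, and the dip undoes the distortion.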
In place of the original software, [Amos] used Blender to simulate the hydrographic film as a piece of cloth interacting with the 3D-printed surface. This allowed him to print an image on PVA film that will “un-distort” as the object is dipped. He built a simple tank with an overflow for the printer bed, used the Z-axis to dip the print, and voilà! Photorealistic frogs and globes.
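Blender’s cloth simulation is fully scriptable, so the draping step can be set up from Python. What follows is a rough sketch of such a setup, not [Amos]’s actual file; the object name “PrintModel” and all of the cloth parameters are assumptions you’d tune by eye:

```python
import bpy

# The 3D print (assumed already imported as "PrintModel") acts as a
# collision body for the film.
target = bpy.data.objects["PrintModel"]
target.modifiers.new(name="Collision", type='COLLISION')

# The hydrographic film: a finely subdivided plane hovering above the print.
bpy.ops.mesh.primitive_plane_add(size=0.3, location=(0, 0, 0.15))
film = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=100)   # a dense mesh drapes more smoothly
bpy.ops.object.mode_set(mode='OBJECT')

# Treat the film as cloth and let it settle over the model under gravity.
cloth = film.modifiers.new(name="Cloth", type='CLOTH')
cloth.settings.quality = 10                      # solver steps per frame
cloth.settings.mass = 0.1                        # thin PVA film is very light
cloth.collision_settings.collision_quality = 5

# Bake the simulation; the film mesh at the final frame records where
# each printed pixel ends up on the object's surface.
bpy.context.scene.frame_end = 120
bpy.ops.ptcache.bake_all(bake=True)
bpy.context.scene.frame_set(bpy.context.scene.frame_end)
```

The dip itself is just a slow, controlled Z move. Here’s a minimal sketch of driving it over serial, assuming a Marlin-style controller; the port, depth, and feed rates below are placeholders, not measured values:

```python
import time
import serial  # pyserial

with serial.Serial("/dev/ttyUSB0", 115200, timeout=5) as printer:
    time.sleep(2)                   # most boards reset when the port opens
    for cmd in ("G91",              # relative positioning
                "G1 Z-60 F100",     # lower the print through the film, slowly
                "G4 P3000",         # dwell while the film wraps the surface
                "G1 Z60 F100"):     # withdraw just as gently
        printer.write((cmd + "\n").encode())
        printer.readline()          # wait for the firmware's "ok"
```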
[Amos]’ method has its limitations, but the results are pretty satisfying already. With a little more tweaking, we’re sure he’ll get to the point that the original authors did, and without their help, thank you very much.
I have to get my hands on some hydrographic sheets. This would be an awesome method of adding detail to objects, but I do think base colours might as well be painted in a more traditional way.
Hand dipping seems viable if the tank dimensions are right; a jig probably wouldn’t hurt.
AliExpress has heaps ready for your printer. I made a suggestion to the tip line a long while ago for Hackaday to sell them with the Hackaday logo on; they would be great for covering instrument boxes…
I’ve seen it done with spray paint on water too, if you check out YouTube.
That’s pretty incredible! Definitely impressive!
From now on, this is how I’ll wash my cat: http://youtu.be/YlUhPrAqiY0#t=3m10s
I don’t know if it’s realistic enough to recreate the staring effect: http://vignette1.wikia.nocookie.net/southpark/images/5/57/206.jpg/revision/latest?cb=20160411023238
That was rather impressive. Thanks for the share. I wonder if you could do this with some sort of conductive ink and make 3D PCBs or sensor arrays. Would be sweet to add touch sensing to any common object… or maybe make some really interesting electromagnets for some unusual speakers.
Weird that the earlier work couldn’t be shared. These are academic institutions, benefitting from taxpayer funds. I guess sometimes they see a chance to make a buck with some technology licensing after investing some of their own cash into the research, or what?
Not American taxpayer funds, however; the original research was done at Zhejiang University (China).
It’ll probably be patented and sold off to some company, delaying when we’ll be able to use it generally. This is the hidden time cost of new developments that most people don’t consider: once something is developed, you then have to wait for the patent to expire before you can use it yourself for anything that isn’t a trinket or two.
This doesn’t apply to copyrighted software, which can be re-implemented, but for processes and inventions that can be patented, it’s a real drag. More so if the patent ends up in the portfolio of a patent troll who sits on it until it expires (meaning you can’t buy one from them and then mod it for your purposes via the doctrine of first sale).
One of the big ones I’m rather depressed about will be the first good synthetic muscles (efficient electrical ones, not heated fishing line). While they’d be immensely useful for prosthetics, they’ll probably be patented six ways to Sunday when they’re invented. Then anyone wanting to build their own prosthetics or robotics would have to buy enough of a product using them to strip and recycle the muscle fibres (the expensive, time-consuming, and sub-optimal hacker route), or else negotiate with the manufacturer for a custom part or purchase the rights to manufacture them themselves (the even more expensive route).
As long as a patent hasn’t already been applied for, it shouldn’t be patentable now…
Using Blender and the properties of cloth simulations, brilliant!
This is great! It also has applications for vacuum forming. Ever since I saw this Hackaday article, I wondered how it could be done: http://hackaday.com/2016/05/04/creating-full-color-images-on-thermoformed-parts/
Guess I’ll be building this soon… http://hackaday.com/2014/10/19/diy-vacuum-former-on-the-cheap/
It doesn’t seem like a more practical solution than hand-painting a model.
It scales up a lot better than hand painting. It makes more sense when you’re doing thousands rather than three.
(Or even just if you have shaky hands…)
Is it possible to contact Amos Dudley?
Hit his Twitter feed. He has a comment above with a link to the feed.
Hmmm… Perhaps a camera that looks straight down to handle the X & Y alignment?
WOW, well done!