May 26, 2015
Changxi Zheng, assistant professor of computer science at Columbia Engineering, working with researchers at Zhejiang University in China, has created a method that brings exact alignment to hydrographic printing. Hydrographic printing is a well-known industrial process for transferring color inks printed on a thin film onto the surfaces of 3D objects.
Computational Hydrographic Printing (SIGGRAPH 2015)
(Video credit: Changxi Zheng)
The researchers have now developed a computational method to optimize the printing process, building a model that predicts how the color film distorts during hydrographic immersion. Using this model, they compute the color image to print on the film so that the transferred texture aligns precisely with the object's surface.
“Attaining precise alignment of the color texture onto the surface of an object with a complex surface, whether it’s a motorcycle helmet or a 3D-printed gadget, has been almost impossible in hydrographic printing until now. By incorporating—for the first time—a computational model into the traditional hydrographic printing process, we’ve made it easy for anyone to physically decorate 3D surfaces with their own customized color textures,” said Zheng.
Hydrographic printing is a mass-production technique for transferring repeating color patterns onto the surface of a 3D object. It can be applied to virtually any material, including porcelain, wood, plastic, and metal. The process uses a PVA film printed with color patterns, which is floated on water.
An activator chemical is applied to the film, softening it so that it stretches easily. The object is then slowly dipped into the water through the floating film. As the film touches the object, it stretches, wraps around the surface, and adheres to it.
The color ink on the PVA film is thereby transferred to the surface. A major limitation of the process, however, is that the stretching of the film makes it impossible to precisely align a color pattern with the object's surface. On complex surfaces, the film may stretch so severely that it tears apart.
“So current hydrographic printing has been limited to transferring repetitive color patterns. But there are many times when a user would like to color the surface of an object with particular color patterns, to decorate a 3D-printed mug with specific, personalized images or just to color a toy,” Zheng explained.
Building on earlier research on fluid and viscous sheet simulation at the Columbia Computer Graphics Group, Zheng developed a new viscous sheet simulation method that models how the color film stretches during hydrographic printing. The simulation predicts the stretch and distortion of the film and produces a map between locations on the object's surface and locations on the film. Using this map, the color image to print on the PVA film can be computed so that, after the hydrographic immersion, it forms the desired color pattern on the surface.
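As a rough illustration of how such a surface-to-film map might be used, the minimal Python sketch below assumes the viscous sheet simulation has already produced, for each sampled surface point, a (u, v) location on the floating film; the function name, array layout, and resolution are hypothetical placeholders, not the authors' code.

```python
import numpy as np

def compute_film_image(surface_colors, surface_to_film_uv, film_res=(1024, 1024)):
    """Accumulate the desired surface colors into the image to print on the film.

    surface_colors     : (N, 3) array of desired RGB values in [0, 1] on the surface
    surface_to_film_uv : (N, 2) array of film coordinates in [0, 1]^2, one per
                         surface sample, produced by the distortion simulation
    """
    h, w = film_res
    image = np.zeros((h, w, 3))
    weight = np.zeros((h, w, 1))

    # Convert normalized film coordinates to pixel indices.
    px = np.clip((surface_to_film_uv[:, 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip((surface_to_film_uv[:, 1] * (h - 1)).astype(int), 0, h - 1)

    # Splat each surface sample's color onto the film pixel it maps to.
    np.add.at(image, (py, px), surface_colors)
    np.add.at(weight, (py, px), 1.0)

    # Average where multiple surface samples land in the same film pixel.
    covered = weight[..., 0] > 0
    image[covered] /= weight[covered]
    return image
```

Printing this "pre-distorted" image on the film means that, once the film stretches over the object during immersion, the colors land at their intended surface locations.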
To demonstrate the effectiveness of the simulation, Zheng and his colleagues built a calibrated system from off-the-shelf hardware. A mechanical apparatus precisely controls the object's immersion, and a 3D vision system measures the object's orientation and dipping location.
Incorporating Zheng's simulation model, the system then computes the color image to feed into the hydrographic process so that the texture registers accurately on the object. To avoid severe film distortion and tearing, a multiple-immersion scheme is used: the object is dipped several times, each time in a different orientation and with a film carrying a different color pattern. The computation of those patterns ensures that the colors transferred in the individual immersions combine into the desired final surface decoration.
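A toy sketch of the multiple-immersion idea follows (hypothetical names and data structures, not the published algorithm): each dip can only reach part of the surface, so each still-uncovered surface sample's target color is assigned to the first immersion that covers it.

```python
import numpy as np

def plan_immersions(target_colors, coverage_masks):
    """Split the desired surface decoration across several dips.

    target_colors  : (N, 3) desired RGB colors at the sampled surface points
    coverage_masks : list of (N,) boolean arrays; coverage_masks[k][i] is True
                     if immersion k (at its chosen orientation) can transfer
                     ink onto surface sample i (assumed given by the simulation)
    """
    n = target_colors.shape[0]
    assigned = np.zeros(n, dtype=bool)
    per_immersion_colors = []

    for mask in coverage_masks:
        take = mask & ~assigned                  # samples this dip should color
        colors = np.zeros_like(target_colors)
        colors[take] = target_colors[take]
        per_immersion_colors.append(colors)
        assigned |= take

    if not assigned.all():
        print(f"{(~assigned).sum()} samples remain uncovered; another dip is needed")
    return per_immersion_colors
```

Each per-immersion color assignment would then be turned into its own film image, for example with a routine like the earlier compute_film_image sketch, so that the dips together reproduce the full decoration.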
“This system is easy to set up for personal use and it’s quite inexpensive, less than 40 U.S. cents per printing. And it works for a wide range of complex surface geometries and materials,” Zheng noted.
He added, “This was a challenging but fun project. I'm very interested in the interaction between the virtual world and the real world, and this research is a good example of how virtual-world computation can work hand-in-hand with a real-world manufacturing process and significantly improve the production quality.”
The study will be presented at SIGGRAPH 2015, taking place in Los Angeles August 9-13. The research was supported in part by the national science foundations of the U.S. and China, in addition to gifts from Intel.