
New Method Provides Confidence in Detection of Flaws in 3D-Printed Metal Parts

Researchers at the Department of Energy's Oak Ridge National Laboratory have improved flaw detection to increase confidence in metal parts that are 3D-printed using laser powder bed fusion. This type of additive manufacturing offers the energy, aerospace, nuclear and defense industries the ability to create highly specialized parts with complex shapes from a broad range of materials. However, the technology isn't more widely used because finished parts are difficult to inspect thoroughly and accurately; conventional inspection methods may not find flaws embedded deep in the layers of a printed part.

ORNL researchers developed a method that combines inspection of the printed part after it is built with information collected from sensors during the printing process. The combined data then teaches a machine-learning algorithm to identify flaws in the product. Just as significantly, the framework lets operators quantify the probability of accurate flaw detection as reliably as traditional evaluation methods that demand more time and labor.
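The paper is the authoritative description of the method; as a rough, hypothetical sketch of the general idea, the snippet below fuses per-layer in-situ camera images with a CT-derived flaw mask and trains a small per-pixel flaw classifier on synthetic placeholder data. The array shapes, the two-channel fusion and the tiny PyTorch model are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): train a per-pixel flaw classifier on
# layer images from in-situ cameras, using CT-derived masks as ground truth.
# Data here is synthetic; real inputs would be registered near-infrared and
# visible-light layer images plus aligned CT label masks.
import torch
import torch.nn as nn

n_layers, h, w = 16, 64, 64           # hypothetical build: 16 layers of 64x64 pixels
nir = torch.rand(n_layers, 1, h, w)   # in-situ near-infrared images (per layer)
vis = torch.rand(n_layers, 1, h, w)   # in-situ visible-light images (per layer)
ct_mask = (torch.rand(n_layers, 1, h, w) > 0.98).float()  # CT-derived flaw mask

x = torch.cat([nir, vis], dim=1)      # fuse the two sensor channels

model = nn.Sequential(                # tiny fully convolutional segmenter
    nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 1),              # per-pixel flaw logit
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()      # flaw vs. no-flaw at every pixel

for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(model(x), ct_mask)
    loss.backward()
    opt.step()
```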

"We can detect flaw sizes of about half a millimeter - about the thickness of a business card – 90% of the time," said ORNL researcher Luke Scime. "We're the first to put a number value on the level of confidence possible for in situ (in process) flaw detection." By extension, that reflects confidence in the product's safety and reliability.

Laser powder bed fusion, the most common metal 3D-printing process, uses a high-energy laser to selectively melt metal powder that has been spread across a build plate. Then the build plate lowers before the system spreads and melts another layer, slowly building up the designed product.

Engineers know there will be flaws in the material.

"For regular manufacturing we know what those are and where and how to find them," said ORNL researcher Zackary Snow. "(Operators) know the probability that they can detect flaws of a certain size, so they know how often to inspect to get a representative sample."

3D printing has not benefited from the same confidence.

"Not having a number makes it hard to qualify and certify parts," Snow said. "It's one of the biggest hurdles in additive manufacturing."

A paper by ORNL researchers and partner RTX, recently published in Additive Manufacturing, explains the process they developed to arrive at a 90% detection rate while reducing the probability of false positives, which can lead to scrapping good products.
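As a back-of-the-envelope illustration of the two quantities the paper balances, the snippet below computes a detection rate and a false-positive rate from a made-up set of inspection calls; the numbers are invented and chosen only to mirror the 90% figure.

```python
# Illustrative arithmetic (numbers invented): why the false-positive rate
# matters alongside the detection rate. Each entry is one inspected region.
# y_true: 1 = real flaw, 0 = good material; y_pred: the algorithm's call.
import numpy as np

y_true = np.array([1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
y_pred = np.array([1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0])

tp = np.sum((y_true == 1) & (y_pred == 1))   # flaws correctly caught
fn = np.sum((y_true == 1) & (y_pred == 0))   # flaws missed
fp = np.sum((y_true == 0) & (y_pred == 1))   # good regions flagged (scrap risk)
tn = np.sum((y_true == 0) & (y_pred == 0))

print(f"detection rate:      {tp / (tp + fn):.0%}")   # 90% in this toy example
print(f"false-positive rate: {fp / (fp + tn):.0%}")   # 10% in this toy example
```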

For the first research step, aerospace and defense company RTX designed a part similar to one it already produces, providing opportunities to see realistic-looking flaws. Then RTX 3D-printed the part multiple times, monitoring each build with a standard near-infrared camera and an added visible-light camera. Both RTX and ORNL researchers conducted quality inspections afterward using X-ray computed tomography, commonly called CT scans.

In consultation with RTX, ORNL additive manufacturing experts aligned the data into a layered stack of images, which essentially became the textbook for the machine-learning algorithm. During training, the algorithm took a first pass at labeling flaws using the CT scan images. Then a human operator annotated the rest based on visual cues in data collected during the printing process. Human feedback continues to train the software, so the algorithm recognizes flaws more accurately each time. Previous ORNL advances in in-situ monitoring and deep-learning frameworks served as tools in this new approach. Over time, this will reduce the need for human involvement in manufacturing inspection.
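The labeling cycle described above can be pictured as a simple loop: the model proposes flaw labels seeded from CT data, an operator corrects them using in-situ image cues, and the corrected labels feed the next training round. The sketch below uses trivial stand-in functions and synthetic arrays; none of it reflects the authors' actual software.

```python
# Conceptual sketch of the human-in-the-loop cycle; every function is a
# trivial stand-in for the corresponding step in the article's description.
import numpy as np

rng = np.random.default_rng(0)

def model_first_pass(ct_slice):
    """Stand-in for the algorithm's initial flaw labels seeded from CT data."""
    return ct_slice > 0.99

def human_annotation(layer_img, proposed):
    """Stand-in for the operator correcting labels using in-situ image cues."""
    corrections = layer_img > 0.99            # e.g. anomalies the first pass missed
    return proposed | corrections

def retrain(model_state, layer_img, labels):
    """Stand-in for one training update on the corrected labels."""
    return model_state + 1                    # here it just counts update rounds

model_state = 0
for layer in range(3):                        # loop over printed layers
    layer_img = rng.random((64, 64))          # in-situ camera image (synthetic)
    ct_slice = rng.random((64, 64))           # registered CT slice (synthetic)
    proposed = model_first_pass(ct_slice)
    labels = human_annotation(layer_img, proposed)
    model_state = retrain(model_state, layer_img, labels)

print("training rounds:", model_state)
```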

"This allows CT-level confidence without CT," Snow said. A common method for checking some 3D-printed parts, CT imaging and analysis drives up costs because it requires extra time and expertise. Plus, CT cannot effectively penetrate dense metals, limiting its application.

When the algorithm is applied to a single design consistently manufactured with the same material and process, it can learn to provide consistent quality analysis within days, Scime said. At the same time, the software incorporates all that it learns from different designs and constructions, so it will eventually be able to accurately check products with unfamiliar designs.

The ORNL-developed inspection framework could help in expanding additive manufacturing applications. With statistically verified quality control, additive manufacturing could become viable for mass-producing products like car parts, Snow said.

It could also diversify the types of parts that can be 3D printed. Certainty about the smallest detectable flaw size allows more design freedom. This is important because the industry is already headed toward larger print volumes and faster print rates – meaning more lasers interacting to create bigger parts with different types of flaws, said Brian Fisher, senior principal engineer for additive manufacturing at RTX's Raytheon Technologies Research Center.

"You can really start to save time and money and make a business case when printing larger assemblies – except those are also the hardest to inspect today," Fisher said. "The vision is with additive, we can make large, highly-complex parts in very dense materials, which traditionally would be very difficult and expensive to thoroughly inspect."

Next, the ORNL team will train the deep-learning algorithm to better differentiate between types of irregularities and to categorize flaws that have multiple causes.

Additional ORNL researchers who contributed to the project include Amir Ziabari and Vincent Paquit. The research was sponsored by DOE's Advanced Materials and Manufacturing Technologies Office, or AMMTO, with support from RTX and took place at DOE's Manufacturing Demonstration Facility at ORNL.

UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time.
