Materials characterization, failure analysis, and service laboratories are under increasing pressure to provide robust, reliable, and statistically relevant results while offering higher throughput and maximum process efficiency.
Automation offers a means of addressing these pressures and challenges. It provides a reliable means of optimizing instrument use, improves the accessibility of instruments, and can reduce the barriers to entry in terms of experience and specialist knowledge.
One of the most important benefits of automation in today’s results-driven, highly competitive application environments is the ability to process more samples and conduct more measurements, improving data density and therefore the overall quality of the data.
More results per sample, or more samples per batch, are crucial to improving overall statistical reliability and affording manufacturers and quality assurance teams increased confidence in their measurements. The ability to acquire more data is also essential for fundamental materials research and development and for failure analysis, where increases in available data can be central to achieving vital insights into the performance or failure of a component or product.
This article explores the benefits of automation through several specific materials characterization applications. It provides practical examples of how we have implemented automation in these processes and outlines some of the tools and approaches that laboratories can employ in automating their own imaging processes, helping to streamline their instruments’ workflows.
The article also looks at how automation tools can provide users with added insights into their processes, ultimately empowering them to evaluate, refine and customize these to meet their specific application needs.
Image Credit: Automation can greatly accelerate nanoparticle characterization. Here, a grid covered with mixed nanoparticles (Si, Ni) was chemically analyzed using a Thermo Scientific™ Talos™ F200X G2 TEM with a Thermo Scientific Super-X™ G2 Detector and the Automated Nanoparticle Workflow (APW). An area with a total width of 1.01 mm was analyzed by acquiring 53 × 16 (848) tiles with a total of 138M+ pixels. Overall, 3,937 nickel and 4,546 silicon nanoparticles were analyzed. The total acquisition time was 18 hours, performed fully unattended using Thermo Scientific Maps and Velox™ Software. Processing was done on the fly with Thermo Scientific Avizo2D™ Software during tile acquisition.
Analyzed on a Thermo Scientific Talos F200X G2 with Super-X™ G2 and automated nanoparticle workflow pack (APW)
Productivity, Performance, and Efficiency
The most widely recognized and common outcome of automation is improved process efficiency.
Manual processes, irrespective of the application or analysis type in question, are typically more time-consuming than their automated counterparts.
For example, SEM technology has evolved a great deal since its inception. It is no longer always necessary to manually adjust or align the column prior to acquiring an image, yet image acquisition itself is still often performed manually by an operator.
This issue is compounded by quality assurance processes that require multiple images of the same sample to be collected, often from randomized locations, to gain a truly representative, statistically robust set of data.
Each of these images may require its brightness, contrast, or other characteristics to be adjusted before analysis can occur, and images must be stored in a coherent, logical manner for later retrieval or further processing.
Automating this process through the use of scripting, a proprietary or third-party instrument interface, or a software tool can transform this labor-intensive, time-consuming process into one where an operator simply runs a recipe. This allows operators to step away from the instrument to focus on other tasks and return later to collect the results or move on to the next stage of the experiment.
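As an illustration of what part of such a recipe might look like, the short Python sketch below batch-adjusts the contrast of a set of acquired images and stores them under a predictable naming scheme. It uses the open-source NumPy and scikit-image libraries; the folder names, filenames, and percentile limits are illustrative assumptions rather than part of any particular instrument’s software.

```python
# Minimal sketch: batch contrast adjustment and structured storage of acquired images.
# Folder names, filenames, and percentile limits are illustrative assumptions only.
from datetime import date
from pathlib import Path

import numpy as np
from skimage import exposure, io

raw_dir = Path("raw_images")                     # hypothetical folder of acquired TIFFs
out_dir = Path("processed") / str(date.today())  # group results by acquisition date
out_dir.mkdir(parents=True, exist_ok=True)

for i, tif in enumerate(sorted(raw_dir.glob("*.tif"))):
    image = io.imread(tif)
    # Stretch contrast between the 2nd and 98th intensity percentiles
    p2, p98 = np.percentile(image, (2, 98))
    adjusted = exposure.rescale_intensity(image, in_range=(p2, p98))
    # Save with a zero-padded, consistent name so results are easy to retrieve later
    io.imsave(out_dir / f"sample_{i:03d}.tif", adjusted)
```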
Improving Instrument Accessibility
A key benefit of automation is its potential to improve the accessibility of instruments. ‘Accessibility’ in this context carries several different meanings.
For example, some advanced instruments feature automated data acquisition processes that can be triggered remotely, allowing operators to begin a task while physically distanced from an instrument.
This capability has been key in allowing many laboratories to continue operating within the ‘new normal’ of remote working and social distancing that many industries have had to adopt.
Once the data acquisition or analysis task has been completed, operators can retrieve results via online portals, cloud-based databases, or remote mirrored versions of instrument interfaces.
Automation also has the potential to lower the barrier to use for many instruments. With complicated, multi-stage experimental and analysis procedures accessible via a user-friendly interface within a few clicks, it is possible to quickly acclimatize new operators to a laboratory’s most advanced data acquisition processes.
It is also possible to ensure that new processes are up and running rapidly by leveraging automation tools such as calibration wizards.
In sectors where laboratories must be responsive and adaptable, the ability to quickly implement new analysis processes and leverage consistently optimized instruments is key.
Image Credit: Aluminum sample in which a 5 × 6 array of S/TEM lamellae was prepared fully unattended with AutoTEM Software, undercut and ready for lift-out within 6 hours. The software allows the array to be defined automatically and individual locations to be shifted so that each lamella can be positioned more precisely.
Consistent Processes Yield Confidence in Results
Human error is widely regarded as a key factor in diminishing the reliability of data and the repeatability of results; even the most experienced operators are likely to introduce a degree of bias when processing tens or hundreds of samples.
Automating processes and minimizing user intervention can vastly improve repeatability and reliability.
Precise results require optimized, consistent experimental procedures and analysis techniques. The use of automated recipes with carefully planned, certified, and industry-standard steps can ensure that results are consistent between batches of samples and between laboratories or between different manufacturing sites.
Applications where manual analysis may be subjective, for example, image analysis of TEM samples or the manual interpretation of XPS spectra, can benefit significantly from the use of software-based algorithms that can objectively analyze, label, and categorize images based on predefined factors.
This degree of consistency is key to many research and development and quality assurance processes, particularly when analyzing components or products in line with ISO or ASTM standards.
These regulated standards require transparent, demonstrable approaches that precisely adhere to prescribed steps throughout every analysis. In research and development applications, consistency is essential in ensuring confident, repeatable results that can be used as the basis for future development and decision-making.
Customizing Automation Recipes with Scripting
Users wanting to move beyond commercially available automation tools and those built into their instruments can benefit from using an array of scripting tools, depending on their specific equipment.
Scripting offers users a direct, programmable interface to control their instruments’ hardware and mode of operation. Most instruments will use either a standard or proprietary scripting language.
For example, the Thermo Scientific Phenom™ desktop SEM employs the Phenom Programming Interface (PPI) platform, which can be programmed via the popular languages Python or C#, depending on the user’s preference and expertise.
A script can be understood as an automation recipe – a series of commands that users create (either themselves or using existing modules or open-source tools) to perform a specific task or series of tasks.
For example, a user could write a script that programs the Phenom SEM to acquire a series of 100 images of a sample from different, randomized locations using a series of scripted commands, a random number generator, and a loop, as in the sketch below.
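A minimal sketch of such a script, assuming a generic scripted microscope interface, is shown below. The `microscope` argument and its `move_stage()` and `acquire_image()` methods are hypothetical stand-ins for the corresponding commands in the instrument’s scripting interface (the actual PPI class and method names differ); only the randomization, loop, and file-naming logic are meant to be taken literally.

```python
# Sketch of a randomized-acquisition recipe. The `microscope` argument and its
# move_stage()/acquire_image() methods are hypothetical stand-ins for the
# instrument's real scripting interface; adapt the calls to your own platform.
import random
from pathlib import Path


def acquire_random_survey(microscope, n_images=100,
                          x_range_mm=(-2.0, 2.0), y_range_mm=(-2.0, 2.0),
                          out_dir="random_survey", seed=42):
    """Acquire n_images at randomized stage positions and save them to disk."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    rng = random.Random(seed)  # fixed seed so the randomized sites can be reproduced

    for i in range(n_images):
        x = rng.uniform(*x_range_mm)
        y = rng.uniform(*y_range_mm)
        microscope.move_stage(x, y)          # hypothetical stage-move command
        image = microscope.acquire_image()   # hypothetical acquisition command
        image.save(out / f"site_{i:03d}_x{x:+.2f}_y{y:+.2f}.tif")
```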
Using automation tools or third-party applications, or even taking the time to work with scripting languages such as Python or C#, can significantly improve the level of control and customization available for any process.
Leveraging Artificial Intelligence
Artificial intelligence represents the next shift towards increasingly automated, intelligent processes.
Many artificial intelligence-based applications are built around the concepts of ‘neural networks’ and ‘machine learning’: tools that have been ‘trained’ by the developer or user to make decisions based on specific inputs.
For example, a neural network may be trained to analyze grayscale SEM images to detect particles with a specific orientation or composition, highlighting these in the image to enable more straightforward quantification or further analysis. The neural network applies a series of learned ‘filters’ to the image, based on what it understands the particles of interest to look like, until its analysis is complete.
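As a rough illustration of that inference step, the sketch below applies a previously trained binary-segmentation network to a grayscale SEM image and counts the resulting particles. The model file, its architecture, and the 0.5 threshold are assumptions made for the purposes of the example; it uses the open-source PyTorch, SciPy, and scikit-image libraries rather than any specific commercial package.

```python
# Sketch: applying a previously trained segmentation network to a grayscale SEM image
# and counting the detected particles. The model file and its architecture are
# assumptions; any trained binary-segmentation model could take their place.
import numpy as np
import torch
from scipy import ndimage
from skimage import io

model = torch.load("particle_segmenter.pt", map_location="cpu")  # hypothetical model file
model.eval()

image = io.imread("sem_field.tif").astype(np.float32)
image = (image - image.mean()) / (image.std() + 1e-8)            # normalize intensities
tensor = torch.from_numpy(image)[None, None, :, :]               # shape (1, 1, H, W)

with torch.no_grad():
    probability = torch.sigmoid(model(tensor))[0, 0].numpy()     # per-pixel particle score

mask = probability > 0.5                                         # binary particle mask
labels, n_particles = ndimage.label(mask)                        # connected-component count
print(f"Detected {n_particles} candidate particles")
```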
As well as offering objective, repeatable results free from operator bias, the use of artificial intelligence as a means of advanced automation allows processes and analyses to be completed in vastly shorter timescales than would be possible manually.
Image Credit: The Thermo Scientific™ Spectra™ 200 scanning transmission electron microscope (S/TEM) is a high-throughput, aberration-corrected scanning transmission electron microscope for all materials science applications.
More Than Just a Black Box
Automation should never be implemented for automation’s sake, and it is important to consider the tangible benefits of automating a process before committing to it.
Much of the criticism of automation stems from the misguided belief that it deskills users by obscuring the inner workings of a process and removing the need to learn its fundamentals. When implemented in the right way, however, automation can empower and educate users.
For example, completing an advanced task using a wizard interface that displays each individual step on screen and keeps the user informed at every stage will allow a novice user to complete an advanced task while gaining valuable insight into the process.
When users spend time working with scripting tools or neural networks to devise their own automated processes, they are prompted to consider their approach and methodology in more detail than perhaps ever before.
While it should be as user-friendly as possible, a well-designed automated system should offer more than simple push-button operation. Users must be given the option to customize parameters at each stage of the process, and they must receive clear, timely feedback throughout to enable troubleshooting and customization as required.
Thermo Fisher Scientific instruments fulfill these requirements, empowering users to complete complex tasks, helping to alleviate backlogs, and minimizing the risk of bottlenecks. Automated recipes can also provide a valuable foundation for the development of more sophisticated, customized methods.
Overall, automation improves productivity by optimizing the utilization of the instruments in a lab, increasing engagement in these processes, and minimizing idle time. Automation makes complex processes more accessible, allowing labs to be flexible and adaptable enough to accommodate complex, shifting analysis requirements in ever-evolving markets and sectors. Automation is also a key tool in ensuring confidence in results, helping to remove human error from processes and ensuring that experiments and analyses are high-throughput, repeatable and consistently performed.
Thermo Fisher Scientific has extensive experience in supporting customers in automating and streamlining their workflows across research and development, quality control, failure analysis, and related tasks. The company’s team is on hand to provide more information and product demos as required.
About Richard White
In his 25 years with Thermo Fisher Scientific, Richard has held diverse roles covering applications, product development, marketing, and commercial management. During this time, his focus has primarily been on the company’s surface science, microscopy, and microanalysis portfolios for materials characterization.
Richard earned his PhD in surface physics from the University of Liverpool in the UK, where his research focused on the electronic and magnetic structure of ultrathin rare-earth films studied using synchrotron radiation.
About Thermo Fisher Scientific
Thermo Fisher Scientific is the world leader in serving science. Through our Thermo Scientific™ brand, we provide innovative solutions for electron microscopy and microanalysis. We offer SEMs, TEMs, and DualBeam™ FIB/SEMs in combination with software suites that take customers from questions to actionable data by combining high-resolution imaging with physical, elemental, chemical, and electrical analysis across scales and modes, and across the broadest range of sample types.