Interview conducted by Alex Smith, March 24, 2023
In this interview, AZoM talks to Simon Taylor from Mettler Toledo’s Titration product group about data integrity in titration and why it is important for laboratories, production lines, or wherever water determinations are performed.
Specifically, the interview explains the goal of data integrity, offers definitions of data quality and data integrity, and then discusses which data is relevant and how it should be generated, saved and used.
The interview also highlights steps in the workflow that clients should consider, particularly in relation to Karl Fischer (KF) titrations. Finally, it covers some solutions that Mettler Toledo offers to help clients achieve their data integrity goals.
What is the goal of data integrity?
The goal of data integrity is to reduce the number of errors by moving towards an automated, transcription-free data flow – that is, away from manual transcription from hybrid, paper or purely manual systems.
Hybrid systems are typically combinations of computerized systems and paper records. For example, a hybrid system can be a computerized system in which there is a combination of original electronic records and paper records that comprise the total recordset that should be reviewed and retained.
What is the difference between data integrity and data quality?
In short, data integrity is the extent to which all data is complete, consistent and accurate throughout the data lifecycle.
Data integrity ensures that data is transcribed without errors and cannot be manipulated, through a proper set-up of the data flow, the integration of data and metadata, proper archiving and the assurance of data accessibility.
Data quality ensures data is generated without errors by using proper, calibrated equipment, following standard operating procedures (SOPs), identifying and training users, and using the right materials. Systems should be designed in a way that encourages compliance with the principles of data integrity.
Data quality is a comprehensive term that – besides data integrity – includes relevance, timeliness, precision, correctness, completeness, credibility, traceability and confidentiality of data. It is used to explain the value of data to a business and refers to the integration of proper data, metadata, automatic transfer, archived data and searchable data.
Anything that makes data less valuable reduces data quality. One component of data quality is data integrity.
Examples of data quality practices include using correct data from calibrated instruments, users complying with the authorized process or SOP, correct and unique user authentication, using the proper chemicals, and generating data without errors.
What is complete data?
Data means all original records and true copies of original records, including source data and metadata and all subsequent transformations and reports of this data, which are generated or recorded at the time of the Good Practice (GxP) activity and which allow the full and complete reconstruction and evaluation of that activity.
Data should be accurately recorded by permanent means at the time of the activity. Data may be contained in paper records (such as worksheets and logbooks), electronic records and audit trails, photographs, microfilm or microfiche, audio or video files or any other media whereby information related to GxP activities is recorded according to ISO 9000:2015.
Complete data includes raw data and observations generated over the course of an analysis, associated contextual metadata and audit trail events if a computerized system is used.
How much data is wrong data?
Source: R. D. McDowall Limited 2016
According to Dr. Bob McDowall of McDowall Consulting, of all the quality data we produce, around 92 to 96% is correct. Of the remaining 4-8% of incorrect data, approximately 95% stems from some form of poor data management and data integrity practice.
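Read as a back-of-the-envelope calculation, those figures imply that the share of all data that is wrong for data management and integrity reasons is roughly

$$0.95 \times 4\% \approx 3.8\% \qquad \text{to} \qquad 0.95 \times 8\% \approx 7.6\%.$$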
Only once I began working at Mettler Toledo and started learning about these strict regulatory requirements did I truly begin to realize the cost, effort and risk management required to ensure traceable, accurate results with full integrity.
This is truly a global issue, not confined to a single area of science – it cuts across the entire value chain and causes lost time for almost all companies. It is also not just a GMP issue: good laboratory practice and good clinical practice inspections also focus on critical aspects of ensuring data integrity.
What is metadata?
Metadata is data that describes the attributes of other data and provides context and meaning. It typically describes the structure, data elements, inter-relationships and other characteristics of data, such as sample identification, date, time, study number, and technical properties (e.g., instrument, calibration history, SOP, method version, audit trail, user, etc.).
Image Credit: Mettler-Toledo - Titration
Metadata is the contextual information required to understand data. A data value is by itself meaningless without additional information about the data. Metadata is often described as data about data and is structured information that describes, explains or otherwise makes it easier to retrieve, use or manage data. For example, the number “23” is meaningless without metadata, such as an indication of the unit “mg.”
Among other things, metadata for a particular piece of data could include a date/time stamp for when the data was acquired, a user ID of the person who conducted the test or analysis that generated the data, the instrument ID used to acquire the data, audit trails, etc.
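As a purely illustrative sketch (the field names here are hypothetical, not a Mettler Toledo or LabX format), a single measurement and its metadata might be recorded together like this:

```python
from datetime import datetime, timezone

# A bare value such as 23 is meaningless on its own; the surrounding
# metadata is what turns it into a reconstructable record.
measurement = {
    "value": 23.0,
    "unit": "mg",                       # without this, "23" means nothing
    "sample_id": "KF-2023-0117",        # sample identification
    "acquired_at": datetime.now(timezone.utc).isoformat(),  # date/time stamp
    "user_id": "analyst_07",            # who performed the analysis
    "instrument_id": "TITR-02",         # which instrument acquired the data
    "method_version": "KF-water-v3.2",  # SOP / method version used
    "audit_trail": [],                  # change events: appended to, never edited
}
```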
Data should be maintained throughout the record’s retention period with all associated metadata required to reconstruct the CGMP activity (e.g., 21 CFR §§ 211.188 and 211.194).
The relationships between data and their metadata should be preserved in a secure and traceable manner. Metadata forms an integral part of the original record. Without metadata, data has no meaning.
What are some common issues with lab instrument metadata?
Laboratory analysts must often comply with standard operating procedures (SOPs) for each analysis and document the entire process as well as record the results.
While many labs have turned towards LIMS (laboratory information management system) and ELN (electronic laboratory notebook) systems with the idea of replacing the manual workflow, these systems are designed primarily to collect result data from an array of analytical tests.
As many companies have discovered, workflows behind benchtop analytical instruments (such as balances, titrators, pH meters and similar instruments) and their associated metadata are much more complex than just the transfer of a few parameters.
Complicating matters further, regulations and standards such as FDA 21 CFR Part 11, EU GMP Annex 11 and ISO 17025 have recognized both the advantages and limits of electronic data systems and are increasingly establishing further controls for the use of such systems, all the way down to benchtop instruments.
So the goal of reducing errors, simplifying processes and reinforcing compliance can become even more challenging when trying to directly integrate and automate the lab.
When assessing the traceability of the measurement, what are some questions to ask?
Two of the most critical steps in KF analyses are the weighing on the balance and the water standard used for calibration.
Thus, some of the most commonly asked questions include:
- Was your balance properly and recently calibrated?
- Have you activated internal adjustment?
- With which system did you store the daily testing log?
- Was it hybrid with paper and electronic records?
- If so, how did the data get from the balance to the electronic record?
- Did you consider your measurement uncertainty or minimum net sample weight when doing a back weighing process for KF?
- Did you know, for example, that the measurement uncertainty for back weighing a syringe for a water standardization is 1.4 times higher than for a standard weighing of a powder? (The reason is sketched just after this list.)
- Which method did the user use? Did it have the correct, validated calculations assigned?
- How did the KF reagent and water standard information enter the KF titrator? Was it copied manually from the label?
- Do you require a four-eyes check of all standardizations and data entry?
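On the 1.4 factor mentioned in the list: a back weighing is the difference of two weighings (the syringe before and after dispensing), so, assuming each weighing carries the same independent standard uncertainty $u$, the two uncertainties add in quadrature:

$$u_{\text{back}} = \sqrt{u^2 + u^2} = \sqrt{2}\,u \approx 1.41\,u$$

Hence a back weighing is roughly 1.4 times more uncertain than a single, direct weighing of a powder.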
These questions are perfectly valid and real examples of why data integrity, particularly the attachment and traceability of metadata, is critical across your entire Karl Fischer workflow.
Where should data transfer begin?
Capturing data at the point of origin is one of the most critical steps in ensuring data integrity. Send the sample information from the LIMS or production system electronically to the balance or, as a minimum, use barcodes or other transcription-free methods.
Image Credit: Mettler-Toledo - Titration
You should also ensure that your process is followed with the proper, robust controls, guaranteeing adherence to the SOP through technical, procedural and administrative controls. For example, use RFID readers to transfer KF reagent data to the titrator, where it is then referenced as metadata – a single point of entry, traceable and secure.
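To make the “single point of entry” idea concrete, here is a minimal sketch – with hypothetical function and field names rather than an actual Mettler Toledo API – of reagent data being read once at the source and then travelling with every result as metadata:

```python
def read_reagent_tag() -> dict:
    """Stand-in for an RFID read; in a real system this record comes
    from the reader hardware rather than manual typing."""
    return {
        "reagent": "KF titrant",
        "concentration_mg_per_ml": 5.0,
        "lot": "A1234",
        "expiry": "2024-06-30",
    }

def run_determination(sample_id: str) -> dict:
    reagent = read_reagent_tag()           # single point of entry: read once, at the source
    result = {"sample_id": sample_id, "water_content_ppm": 152.3}  # placeholder result value
    result["reagent_metadata"] = reagent   # metadata travels with the result, never retyped
    return result

print(run_determination("KF-2023-0117"))
```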
To ensure integrity, data should also be transferred electronically throughout the sample lifecycle, and it should be stored correctly and securely.
Integrated solutions start at the origin of the data. The benefits are that:
- A piece of information is added only once
- Data is available throughout the system
- Data/information moves continuously from the start of a process to the end, without the need for manual effort such as transcription
- Data is stored and secured throughout the required retention period
At the end of the day, an integrated system results in a better work environment.
How does increasing control reduce risk?
Increasing instrument-based technical controls can reduce procedural and/or administrative control workloads. For example, integrating your SOP into the instrument itself converts procedural steps into technical controls, which in turn reduces procedural control overhead.
The natural result of increasing technical control capabilities is a reduced burden on procedural controls to cover any gaps – the overlap is simply greater, and this ultimately reduces the risk of delivering wrong data.
Essentially, the closer you get to the ALCOA+ principles (attributable, legible, contemporaneous, original and accurate, plus complete, consistent, enduring and available) and to the principles of data quality, the easier daily life becomes for you and your colleagues or employees.
Image Credit: Mettler-Toledo - Titration
What are some enhanced workflow solutions?
“Smart” RFID technology brings error-free analyses to the laboratory. There are two types of “Smart” RFID technology: SmartSample and SmartChemicals.
SmartSample brings weighing and titration together wirelessly. Sample weight and sample ID are two critical parameters to ensure accurate and traceable results. SmartSample also enables the transcription-free transfer of sample data using a smart-tagged titration beaker.
Image Credit: Mettler-Toledo - Titration
SmartChemicals deliver transcription-free titrant information with a simple tap. The labels of SmartChemical-enabled bottles carry all of the relevant information, and RFID-enabled burettes ensure that the data stays with the titrant bottle.
SmartChemicals are globally available from Honeywell (Hydranal™).
Image Credit: Mettler-Toledo - Titration
How does electronic data transfer reduce errors in workflow?
A contactless transfer of data from the KF titrant bottle to a Mettler Toledo titrator instantly transfers all relevant information about a Karl Fischer titrant: the titrant name, the nominal concentration (e.g., 5 mg/mL), the lot and batch number, its shelf life and its expiration date.
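As a sketch of the kind of record such a contactless transfer might carry (the class and field names here are illustrative, not the actual tag format):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class KFTitrant:
    name: str
    nominal_concentration_mg_per_ml: float
    lot: str
    batch: str
    shelf_life_months: int
    expiration_date: date

    def is_expired(self, today: date) -> bool:
        # A titrator can refuse to start an analysis with an expired titrant,
        # a technical control that replaces a manual check of the label.
        return today > self.expiration_date

titrant = KFTitrant("HYDRANAL Composite 5", 5.0, "A1234", "B5678", 24, date(2024, 6, 30))
assert not titrant.is_expired(date(2023, 3, 24))
```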
Image Credit: Mettler-Toledo - Titration
Electronic data transfer eliminates mistakes. RFID technology again supports the user and allows for seamless transfer, storage and secure handling of analysis data.
With SmartSample, you can eliminate all mistakes related to the transcription of sample mass during the weighing process, while SmartChemicals ensure that all Karl Fischer titrant chemical data is stored without errors.
Image Credit: Mettler-Toledo - Titration
What are some advantages of LabX® laboratory software?
With LabX® laboratory software, users can streamline processes and reduce system maintenance. LabX manages your instruments, resources, SOPs, tasks and users centrally and distributes information to connected instruments.
Image Credit: Mettler-Toledo - Titration
You can reduce your efforts for validation, maintenance and support of your computerized lab systems whilst ensuring smooth operation of your lab systems, all with a single harmonized user interface.
You can also seamlessly integrate lab systems into ELNs, LIMS or ERPs via the single LabX interface that covers complete workflows. This applies not only to Karl Fischer titration but also to your Mettler Toledo automated titration systems, Excellence balances, UV/VIS spectrophotometers, density meters, refractive index instruments, melting point determinations and pH measurements, too.
About Simon Taylor
Simon joined Mettler-Toledo UK Ltd. in 2011 as a sales specialist in the UK, and in 2012 moved to Switzerland to take the role of Global Analytical Balance Product Manager. In 2017, Simon moved to METTLER TOLEDO ANALYTICAL to lead the Titration product marketing and management team.
Simon has over 20 years of experience in analytical chemistry, from the beginning as a QC Chemist at Pfizer, Sandwich, to scientific instrument sales, and now develops leading analytical instrument solutions at METTLER TOLEDO.
This information has been sourced, reviewed and adapted from materials provided by Mettler Toledo - Titration.
For more information on this source, please visit Mettler Toledo - Titration.
Disclaimer: The views expressed here are those of the interviewee and do not necessarily represent the views of AZoM.com Limited (T/A) AZoNetwork, the owner and operator of this website. This disclaimer forms part of the Terms and Conditions of use of this website.