Digital Support for Lab Techs Helps Them Avoid Mistakes and Prevent Quality-Related Errors

DeckCheck™ for lab techs is a deep-learning vision system developed by CSEM and integrated into Tecan's Fluent® automated workstation. DeckCheck™ displays discrepancies such as missing plates or incorrectly loaded tip boxes and allows the operator to correct the situation.

Image Credit: CSEM

Laboratory workstations, whose liquid-handling robots automate complex workflows, still rely on the manual setup of labware. Inaccurate configuration of trays, tubes, and plates can result in ruined tests, broken kits, costly delays, and lost patient samples. Tecan set its sights on a digital solution capable of detecting setup mistakes with exceptional accuracy for any given workflow, and CSEM was asked to develop a vision system powered by a deep-learning algorithm for integration into Tecan's lab automation platform.

One-Shot Learning: A New Deep-Learning Comparison Method for Measuring the Similarity and Difference Between Images

CSEM developed a completely new neural image comparison approach that can cope with vast variations in labware as well as changes in illumination, reflections, and shadows. The DeckCheck™ system acquires and analyses an image of the worktable, compares it to a reference image, and flags discrepancies in labware objects and positions in under 1.5 seconds with 95% reliability, reducing risks and operators' workloads. The platform has been launched, and its algorithm will also be valuable in other domains, such as quality control, security, and surveillance.
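The article does not disclose how DeckCheck™ is implemented, but the behaviour it describes, comparing a live image of the deck to a reference image and flagging the regions that differ while tolerating lighting changes, maps naturally onto a Siamese-style (one-shot) comparison. The sketch below is purely illustrative and is not CSEM's system: a shared convolutional encoder embeds corresponding patches of the reference and current images, and patches whose embeddings fall below a cosine-similarity threshold are reported as discrepancies. The encoder architecture, patch size, threshold, and the names PatchEncoder and flag_discrepancies are all assumptions introduced here for illustration.

```python
# Illustrative sketch only -- not CSEM's DeckCheck(TM) implementation.
# A shared (Siamese) encoder embeds aligned patches of a reference image and a
# live image; patches whose embeddings diverge are flagged as discrepancies.
import torch
import torch.nn as nn
import torch.nn.functional as F

PATCH = 64  # hypothetical patch size in pixels


class PatchEncoder(nn.Module):
    """Small CNN mapping a 3 x PATCH x PATCH patch to a unit-length embedding."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalise so cosine similarity reduces to a dot product of embeddings.
        return F.normalize(self.net(x), dim=1)


def flag_discrepancies(reference: torch.Tensor,
                       current: torch.Tensor,
                       encoder: PatchEncoder,
                       threshold: float = 0.8) -> list[tuple[int, int]]:
    """Return (row, col) grid indices of patches whose cosine similarity to the
    reference drops below `threshold` (an assumed, untuned value)."""
    flagged = []
    _, h, w = reference.shape
    for top in range(0, h - PATCH + 1, PATCH):
        for left in range(0, w - PATCH + 1, PATCH):
            ref_patch = reference[:, top:top + PATCH, left:left + PATCH]
            cur_patch = current[:, top:top + PATCH, left:left + PATCH]
            with torch.no_grad():
                sim = F.cosine_similarity(encoder(ref_patch.unsqueeze(0)),
                                          encoder(cur_patch.unsqueeze(0)))
            if sim.item() < threshold:
                flagged.append((top // PATCH, left // PATCH))
    return flagged


if __name__ == "__main__":
    encoder = PatchEncoder()        # in practice, trained on pairs of deck images
    ref = torch.rand(3, 256, 256)   # stand-in for the stored reference deck image
    cur = torch.rand(3, 256, 256)   # stand-in for the live camera image
    print(flag_discrepancies(ref, cur, encoder))
```

The point of comparing learned embeddings rather than raw pixels is robustness: a well-trained encoder can map a patch seen under different illumination, reflections, or shadows to nearly the same embedding, so only genuine changes in labware presence or position push the similarity below the threshold.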


Keeping Watch

Tecan has recently launched its worktable recognition solution, DeckCheck™, commercially and is now ramping up its component and subsystem digitalization efforts towards intelligent automation.

“DeckCheck™ will strengthen Tecan’s Fluent platform by providing state-of-the-art assistance in setting up and verifying workflows. It is expected to significantly reduce user error, the cause of a substantial proportion of workflow crashes. This will contribute to a sizeable decrease in sample and consumable losses, thus yielding a positive environmental impact,” says Dany Chek, Senior Project Manager at Tecan.

The new algorithm's potential doesn't end there, however. CSEM has already used it for a project in the field of quality control for electronic components and envisages wider applications.

As Roger Limacher, Project Manager at CSEM, explains, “The neuronal image comparison approach is also highly relevant to sectors such as security and surveillance where you want to make sure that an object remains where it should be and understand what an image of the object will look like in different light conditions.”

