TOP 5 trends for lab digitization and automation in 2019

In a fast-paced environment, the pressure to innovate and work efficiently is making digitization and automation significant tools of laboratory practice. In the era of Industry 4.0 (known in the scientific realm as the fourth industrial revolution), it is now more crucial than ever to find rapid solutions to arduous tasks in the laboratory.

For R&D projects and other scientific research that require seemingly endless iterations, handling large volumes of research data becomes significantly more difficult in a paper-based, manually led environment. Hence the need for laboratory digitization and automation.


Laboratory robotics

Robotic processes that automate repetitive human work have already been put in place to increase throughput in the laboratory. Robots can ease laboratory workflows at many scales, from a pipetting robot that takes over a single simple task to automated workstations, such as the Tecan Automated Workstation, that replace an entire process.

Moreover, anthropomorphic robotic arms can assist the human scientist. Such arms are flexible enough to carry out well-established and validated processes such as bottle opening, flask shaking and decanting. The Society for Biomolecular Sciences’ standard microtitre plate format has helped innovate the handling of liquid samples, and two-fingered grippers and arms can transport such well-defined items swiftly and reliably. However, many systems are still hindered by a significant lack of agility.

The next generation of robots is the collaborative robot, or ‘co-bot.’ Such robots can work in close proximity to bench scientists thanks to force-sensing capabilities and light barriers that prevent collisions. Expanding on this cooperation between robots and experts, the manufacturing craftsman’s role of making small batches or unique parts has been passed on to robots, allowing scientists to focus on more challenging bench work.

Robotic systems have been designed to learn tasks from small sets of examples and then perform them autonomously. Such robots are programmed to detect problems and resolve errors, guiding users to recover and continue, which removes the need for scientists to code instructions. This may sound like a dystopian case of robots taking over scientists’ jobs, but in reality it allows scientists to reach conclusions more rapidly by freeing up time spent on monotonous tasks.

Device integration

Laboratory device integration links individual lab instruments to a larger system, replacing the archaic mechanism of manual data processing. Integrated devices can also automate workflows: a work order created in an automation manager is dispatched to a set of laboratory devices, which perform the tests and return their results.
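
As a rough illustration of this pattern, the sketch below submits a work order to a hypothetical automation manager and retrieves the results once the devices report back. The endpoint names, fields and the `run_work_order` helper are all assumptions for illustration, not any specific vendor’s API.

```python
# Minimal sketch of a device-integration work order against a hypothetical
# REST-style automation manager; endpoints and fields are illustrative only.
import time
import requests

MANAGER_URL = "https://automation-manager.example.com/api"  # hypothetical

def run_work_order(device_ids, protocol, api_token):
    """Create a work order, wait for the devices to finish, fetch results."""
    headers = {"Authorization": f"Bearer {api_token}"}

    # 1. Create the work order describing which devices run which protocol.
    order = requests.post(
        f"{MANAGER_URL}/work-orders",
        json={"devices": device_ids, "protocol": protocol},
        headers=headers,
    ).json()

    # 2. Poll until every device has reported back to the manager.
    while True:
        status = requests.get(
            f"{MANAGER_URL}/work-orders/{order['id']}", headers=headers
        ).json()
        if status["state"] in ("completed", "failed"):
            break
        time.sleep(10)

    # 3. Retrieve the processed results from the central server.
    return requests.get(
        f"{MANAGER_URL}/work-orders/{order['id']}/results", headers=headers
    ).json()
```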

For example, Cubuslab’s device integration system reduces processing time by automating workflows. One Cubuslab connector allows the connection of up to four devices with a digital data interface, and each connector in the system communicates bi-directionally with a central server or the cloud, where data is automatically processed. Such cloud-connected equipment allows lab workers to schedule equipment use in advance and pre-plan experiments from their computers or smartphones.

Device integration can help to automate barcoding, liquid handlers, Laboratory Information Management Systems (sample tracking, storage and retrieval), sample analyzers, and high-throughput screening and robotics, yet its advantages are still unrecognized by many.

Electronic Lab Notebook (ELN)

As scientific research becomes more focused on analyzing big data, a digital platform to collate this information is imperative. The scientific community still predominantly uses paper lab notebooks to store data, which makes big data analysis almost impossible to execute. A growing number of ELNs target the root of this problem by moving the storage of experimental data from paper to a digital platform. labfolder has an informative ELN Guide that highlights the advantages of ELNs, as well as a comparison of the top ELNs on the market.

At the core of every research project is the lab notebook, and this resource is also one of the main culprits of data integrity problems. Lab notebooks need to keep up with the digitization of the laboratory environment. ELNs standardize the way scientists record and retrieve their research electronically while maintaining crucial data compliance requirements such as full audit trails and encryption.
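
To make the audit-trail requirement concrete, here is a conceptual sketch of an append-only, hash-chained record log, in which any retroactive edit breaks the chain and is therefore detectable. This illustrates the general idea, not how any particular ELN implements it.

```python
# Conceptual audit trail: each record carries the hash of its predecessor,
# so tampering with past entries invalidates every later hash.
import hashlib
import json
import time

class AuditTrail:
    def __init__(self):
        self.records = []

    def append(self, user, action, payload):
        previous_hash = self.records[-1]["hash"] if self.records else "0" * 64
        record = {
            "timestamp": time.time(),
            "user": user,
            "action": action,
            "payload": payload,
            "previous_hash": previous_hash,
        }
        # Hash the record body and store the digest alongside it.
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.records.append(record)

    def verify(self):
        """Recompute every hash; any edit to a past record breaks the chain."""
        previous_hash = "0" * 64
        for record in self.records:
            body = {k: v for k, v in record.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if record["previous_hash"] != previous_hash or digest != record["hash"]:
                return False
            previous_hash = record["hash"]
        return True

if __name__ == "__main__":
    trail = AuditTrail()
    trail.append("alice", "create_entry", {"title": "PCR run 12"})
    trail.append("alice", "edit_entry", {"note": "adjusted annealing temperature"})
    print(trail.verify())  # True until any stored record is altered
```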

The majority of ELN providers also expedite collaboration by allowing multiple scientists to combine their research and discuss it in real time. In combination with device and software integrations (mainly via API), ELNs are now moving in the direction of operating systems for the laboratory that allow for the management of laboratory instruments and data repositories.

Cloud computing

Demand for cloud computing has risen dramatically in the last decade, driven by significant improvements in computational power and increasingly large data storage facilities. Thanks to cloud storage, small-scale laboratories as well as large research projects can store their data securely without owning a data storage facility of their own. Cloud computing is also crucial for laboratories that share their data securely with international researchers, providers and investors, and the improved security of the cloud has boosted scientists’ trust in sharing data.

Zenodo, funded by the European Commission, provides a dependable way to store large sets of research output safely (up to 50GB per dataset) in the same cloud infrastructure as research data from CERN’s Large Hadron Collider (LHC), using CERN’s open-source repository software, Invenio. Invenio also powers CERN’s Open Data Portal, a data store that provides open access to real collision event data from the LHC experiments in the hope of stimulating the creativity of external users.
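
For a sense of how such a repository is used programmatically, the sketch below deposits a dataset through Zenodo’s public REST API, following the pattern in Zenodo’s developer documentation; the file name and metadata are placeholders, and the current documentation should be checked before use.

```python
# Depositing a dataset on Zenodo via its REST API (token from account settings).
import requests

ZENODO_API = "https://zenodo.org/api"
params = {"access_token": "your-personal-access-token"}

# 1. Create an empty deposition (a draft record).
deposition = requests.post(
    f"{ZENODO_API}/deposit/depositions", params=params, json={}
).json()

# 2. Upload the data file into the deposition's file bucket.
with open("results.csv", "rb") as fp:
    requests.put(
        f"{deposition['links']['bucket']}/results.csv", data=fp, params=params
    )

# 3. Attach minimal metadata, then publish the record.
metadata = {
    "metadata": {
        "title": "Example lab dataset",
        "upload_type": "dataset",
        "description": "Raw results from an automated laboratory workflow.",
        "creators": [{"name": "Doe, Jane"}],
    }
}
requests.put(
    f"{ZENODO_API}/deposit/depositions/{deposition['id']}",
    params=params, json=metadata,
)
requests.post(
    f"{ZENODO_API}/deposit/depositions/{deposition['id']}/actions/publish",
    params=params,
)
```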

Big data analysis

We live in a “data rich – information poor” world, where access to data is easy but access to actionable information is not. Laboratories produce significant amounts of data that remain locked in disparate information management systems, which obstructs efficient time management in the lab. For quality testing labs in particular, which prioritize efficiency and rapid turnaround times, big data analytics can be a very useful tool.

Novartis and Oxford University recently began a partnership that gives the Swiss pharma company access to the University’s Big Data Institute (BDI) to research treatments for inflammatory diseases such as multiple sclerosis and psoriasis. Their five-year alliance will draw on anonymized data from 5 million patients held within the BDI, as well as all the anonymized data from Novartis’ clinical trials.

The two groups will combine their innovative IT and AI processes to find patterns across imaging, genomics, and clinical and biological data that would not be found otherwise. This partnership is an example of the potential benefits of analyzing large data sets to accelerate research into complex topics such as inflammatory diseases.
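
As a toy illustration of this kind of cross-dataset pattern-finding, the sketch below joins two synthetic “anonymized” tables on a pseudonymous patient ID and clusters the combined features; it is purely illustrative and unrelated to the actual Novartis/BDI pipeline.

```python
# Toy cross-dataset clustering: merge clinical and genomic tables on a
# pseudonymous ID, then look for patient subgroups in the combined features.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 200  # synthetic stand-in for anonymized patient records

clinical = pd.DataFrame({
    "patient_id": range(n),
    "age": rng.integers(18, 90, n),
    "severity_score": rng.normal(5.0, 2.0, n),
})
genomic = pd.DataFrame({
    "patient_id": range(n),
    "risk_marker": rng.normal(0.0, 1.0, n),
})

# Join the modalities, normalize, and cluster into candidate subgroups.
combined = clinical.merge(genomic, on="patient_id")
features = combined.drop(columns=["patient_id"])
scaled = StandardScaler().fit_transform(features)
combined["subgroup"] = KMeans(n_clusters=4, random_state=0).fit_predict(scaled)

# Profile each subgroup: patterns here span both datasets at once.
print(combined.groupby("subgroup")[features.columns].mean())
```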

Conclusion

The coalescence of software and hardware is revolutionizing the scientific industry. The scientific community has been trailing behind the manufacturing industry’s rapid advancements in cognitive computing, cyber-physical systems and automation (the hallmarks of Industry 4.0), but not for much longer. As the interconnectivity of lab processes, AI and IT increases, the scientific community is bound to see an acceleration of research in the laboratory. And if others follow CERN’s example of opening access to its LHC data, the scientific world may come a whole lot closer to its conclusions.

Written by Claudia Telling, labfolder.com

