“The lab of the future won’t be bounded by walls.” This statement, a distillation of the views expressed at the recent Lab of the Future congress, was delivered by the event’s keynote speaker, Bryn Roberts, PhD, senior vice president, global head of operations, Pharma Research and Early Development, Roche. He went on to explain that connectedness is being enhanced by a range of technologies. Connectedness, he emphasized, will allow vast amounts of data to be gathered outside of the classical laboratory setting.

At Roche’s Pharma Research and Early Development unit in Basel, establishing the “lab of the future” is a matter of promoting collaboration and innovation. To do so, Roche is adopting the FAIR (findable, accessible, interoperable, and reusable) data concept; embracing design principles such as transparency, flexibility, and modularity; and deploying state-of-the-art digitalization and automation. Roche anticipates that these initiatives will drive productivity, improve data quality, and maintain a healthy working environment.
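
To make the FAIR principles concrete, the sketch below shows what a FAIR-style metadata record for a lab dataset might look like. The schema, identifier, and URL are invented for illustration and are not Roche's actual implementation.

```python
# A FAIR-style metadata record for a lab dataset (hypothetical schema).
dataset_record = {
    # Findable: a globally unique, persistent identifier plus rich metadata
    "id": "doi:10.0000/example-assay-001",            # placeholder DOI
    "title": "Kinase inhibition assay, plate batch 42",
    "keywords": ["kinase", "IC50", "high-throughput screening"],
    # Accessible: retrievable over a standard, open protocol
    "access_url": "https://data.example.org/assays/001",
    "protocol": "HTTPS",
    # Interoperable: open formats and shared vocabularies
    "format": "text/csv",
    "units": {"concentration": "nM", "response": "percent_inhibition"},
    # Reusable: explicit license and provenance
    "license": "CC-BY-4.0",
    "provenance": {"instrument": "plate_reader_7", "recorded": "2019-11-02"},
}
```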

The congress, which was held at the Wellcome Genome Campus in Cambridge, UK, gave Roberts the opportunity to develop the connectedness theme with real-life examples. He discussed the PASADENA trial, a Phase II trial that is evaluating whether prasinezumab, a monoclonal antibody that targets α-synuclein, can slow or halt disease progression in Parkinson’s disease. In this trial, participants are using “smart” devices to remotely monitor activities of daily living, such as sitting-to-standing transitions. Roberts also mentioned the V1aduct trial, a Phase III trial in which adults with autism are receiving wearable devices that help them monitor interactions with family members, allowing investigators to study whether balovaptan, a vasopressin receptor antagonist, can improve social and communication behaviors.

“Using these wearable and mobile technologies,” Roberts noted, “means that we can gather clinical data continuously without patients having to come into a trial center or hospital so often, reducing the burden on subjects and generating powerful real-world-relevant data.”

An example of connectedness at the drug discovery and development stage, discussed at the conference by Pieter Peeters, PhD, senior director, Computational Biology, Janssen Pharmaceutica, is the MELLODDY (Machine Learning Ledger Orchestration for Drug Discovery) project. Set up in 2019, the project is being undertaken by a consortium of 17 organizations: 10 pharma companies, 2 universities, 4 subject matter experts, and 1 artificial intelligence (AI) computing company.

“The aim of the project is to use machine learning methods on the chemical library and pharmacological data sets from 10 pharma companies in a virtual environment to develop more accurate predictive models,” Peeters said. “This will enable us to determine which compounds could be promising in the later stages of drug discovery and development, without giving away competitive data, which are a pharma company’s crown jewels.”
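
The privacy-preserving training Peeters describes is broadly in the spirit of federated learning, in which partners exchange model updates rather than raw data. The toy sketch below illustrates that idea with simple federated averaging on synthetic data; MELLODDY's actual platform, with its ledger orchestration and security layers, is far more sophisticated.

```python
import numpy as np

def local_gradient(weights, X, y):
    """One least-squares gradient computed on a partner's private data."""
    return 2 * X.T @ (X @ weights - y) / len(y)

def federated_round(weights, private_datasets, lr=0.01):
    """Each partner trains locally; only model updates leave the firewall."""
    updates = [local_gradient(weights, X, y) for X, y in private_datasets]
    return weights - lr * np.mean(updates, axis=0)   # aggregate updates only

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0])
partners = []                       # ten "companies," each with private data
for _ in range(10):
    X = rng.normal(size=(100, 2))
    partners.append((X, X @ true_w + rng.normal(scale=0.1, size=100)))

w = np.zeros(2)
for _ in range(500):
    w = federated_round(w, partners)
print(w)   # approaches true_w although no partner ever shared its raw data
```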

There were also discussions on how the connectedness of social media can be used to gather data on disease symptoms. Michael Shanler, PhD, research vice president at Gartner, explained: “Certain groups of patients, for example, those with rare diseases, are more comfortable openly sharing their medical data.”

Adding to the social media discussion, Peeters said, “People will often post things on Facebook that they don’t think are worth discussing with their physicians, yet these things are often having an effect on their quality of life on a daily basis.”

According to Roberts, social media can be mined to advantage. “Using social media, Roche has identified additional symptoms in Parkinson’s disease,” he noted. “[These symptoms] were previously not included in our clinical trial disease models. We are now including them to help guide our clinical endpoints.”

Integrated automation

Automation already plays a key role in today’s labs, and this is not going to change in the future. Delegates at the conference agreed that biopharma companies will continue to move toward the concept of Lab 4.0, where all lab infrastructure and equipment are connected by an internet of things (IoT), a web of software, sensors, and actuators that enables remote control and the seamless exchange of data.
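
A minimal sketch of the Lab 4.0 pattern might look like the following: instruments publish telemetry to named topics, and any connected service can subscribe and react. Real deployments would run over a message broker such as MQTT; the topic names, payloads, and watchdog logic here are invented.

```python
import json
import time
from collections import defaultdict

subscribers = defaultdict(list)      # topic -> list of handler callbacks

def subscribe(topic, handler):
    subscribers[topic].append(handler)

def publish(topic, payload):
    """Deliver a timestamped JSON message to every subscriber of a topic."""
    message = json.dumps({"topic": topic, "ts": time.time(), **payload})
    for handler in subscribers[topic]:
        handler(json.loads(message))

def temperature_watchdog(msg):
    """A monitoring service that reacts to incubator telemetry."""
    if msg["temp_c"] > 38.0:
        publish("lab/alerts", {"source": msg["topic"], "alert": "overtemp"})

subscribe("lab/incubator-3/telemetry", temperature_watchdog)
subscribe("lab/alerts", lambda msg: print("ALERT:", msg))

publish("lab/incubator-3/telemetry", {"temp_c": 37.1})   # normal, no alert
publish("lab/incubator-3/telemetry", {"temp_c": 38.6})   # triggers the alert
```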

Besides describing the types of automation being used to control everything from chemical synthesis to drug screening, Mark Wigglesworth, director of high-throughput screening at AstraZeneca, emphasized that automation technologies can work in harmony with people. In drug screening, he maintained, progress depends on modular, flexible units. The company’s drug screening facility in the UK, he said, is based on skids that can move different instruments in and out of a screening platform.

“We worked with vendors including HighRes Biosolutions and LabCyte [now Beckman Coulter Life Sciences] to create a high-throughput screening lab in the UK around a platform we call NiCoLAb, which uses acoustic technology to transfer nanoliters of compounds into assay plates,” Wigglesworth detailed. “Since we have integrated all our automated devices around a central control architecture, we can move equipment in and out when we need it for different screens. It is also designed with yellow band and laser sensor technology adapted from the car industry, which means scientists can interact with the robots without guards in a safe manner.”

According to Wigglesworth, using the NiCoLAb platform, scientists can test 80,000 compounds a day for different diseases and can screen AstraZeneca’s vast compound collection in four to six weeks, which represents a 60-fold increase in throughput from the company’s previous screening technology. “We have increased our productivity so much,” he declared, “that we now run around 20 high-throughput screens for external life science organizations.”
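
As a rough consistency check on those figures, assume a compound collection on the order of two million compounds (a number not given in the article). The quoted daily rate then lands inside the stated four-to-six-week window:

```python
compounds_per_day = 80_000        # quoted NiCoLAb daily throughput
collection_size = 2_000_000       # assumed collection size, not in article

working_days = collection_size / compounds_per_day
print(f"{working_days:.0f} working days")   # 25 days, i.e., 5 working weeks
```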

Building on its screening lab success, AstraZeneca has turned to integrated chemical synthesis to automate design-make-test-analyze cycles. This work, which has been undertaken by the company’s iLAB group in Gothenburg, Sweden, encompasses automated scale-up synthesis and the integration of systems from Tecan, TTP Labtech, and Zinsser. According to Michael Kossenjans, PhD, the head of iLAB, a range of syntheses can be performed, and compounds can be purified and ready to screen in just one hour.

Integrating automation is often a complex task, as many equipment manufacturers work in isolation and require innovators at biopharma companies to come to them with an idea and act as collaborative brokers. For example, Jonathan Wingfield, PhD, principal scientist, Discovery Sciences, R&D at AstraZeneca, discussed how his lab had worked with LabCyte and Waters to integrate acoustic dispensing via a system of capillaries to a mass spectrometer (MS) to produce an automated high-throughput MS system.

“Traditional mass spectrometry is a great way of finding out what’s going on in biology,” explained Wingfield. “But because of the amount of sample it requires and the time it takes to do the analysis, it is generally used for only a small number of samples. By developing an integrated high-throughput platform which uses small sample amounts, we can generate 100,000 data points every 20 hours, which would be impossible using traditional MS protocols.”
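
A quick back-of-envelope calculation shows what that rate implies per sample; the conventional LC-MS run time used for contrast is an assumption, not a figure from the article:

```python
data_points = 100_000             # quoted figure
hours = 20                        # quoted figure

per_sample_s = hours * 3600 / data_points
print(f"{per_sample_s:.2f} s per sample")            # 0.72 s per sample

assumed_lcms_min = 5              # assumed conventional LC-MS run time
print(f"~{assumed_lcms_min * 60 / per_sample_s:.0f}x faster")   # ~417x
```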

For a glimpse of the future, Roberts discussed how Roche facilities under construction in Basel, Switzerland, and Shanghai, China, will enhance flexibility and collaboration. According to Roberts, modularity and a digital planning system will enable sharing and rapid cost-effective repurposing of infrastructure. Labs of the future, he insisted, will be much more heavily controlled by IoT.

“We have laboratory management systems and autonomous vehicles to automate lab supplies by interacting with stores and procurement systems to deliver consumables just-in-time and reorder reagents when they are running low,” Roberts elaborated. “In case of contamination in a cleanroom, for example, we will be able to automatically identify operatives who may have been exposed and block them from entering other sensitive rooms for a period of time.”
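
The reordering behavior Roberts describes boils down to a stock-level rule of the kind sketched below. The reagent names, thresholds, and consumption figures are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Reagent:
    name: str
    on_hand_ml: float
    daily_use_ml: float     # consumption estimated from usage telemetry
    lead_time_days: float   # supplier delivery time

def needs_reorder(r: Reagent, safety_days: float = 2.0) -> bool:
    """Reorder when stock covers less than lead time plus a safety margin."""
    return r.on_hand_ml < r.daily_use_ml * (r.lead_time_days + safety_days)

stock = [
    Reagent("PBS buffer", on_hand_ml=900.0, daily_use_ml=50.0, lead_time_days=3),
    Reagent("Trypsin",    on_hand_ml=120.0, daily_use_ml=40.0, lead_time_days=2),
]
for r in stock:
    if needs_reorder(r):
        print(f"raise purchase order: {r.name}")   # Trypsin only
```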

In addition to automated systems, labs will make increasing use of wearable technologies that enhance the performance of scientists in wet lab experiments. Roberts cited an example of how Roche scientists are running experiments in high-containment labs using augmented reality safety glasses and voice control to maintain consistency of processes, enhance data quality, and facilitate health and safety measures.

“These technologies are very powerful for reducing experimental errors and automating data capture,” Roberts asserted. “We have also realized significant productivity gains in some settings by providing scientists hands-free capture of data and experimental records.”

Digital vs. digitalization

Conference speakers agreed that the biopharma industry is moving toward capturing all data and results electronically, with wide implementation of electronic lab notebooks and databases. The speakers, however, hastened to point out that they were referring to the digitalized lab, not just the digital (or paperless) lab.

The distinction between digital and digitized was explained by Chris Waller, PhD, vice president and chief scientist at EPAM Systems, a provider of digital platform engineering and software development services. “Digital just collects data in an electronic format,” he said. “Digitalization includes data analysis to fully leverage the value of the data for decision making.”

To illustrate applications of digitalized data, speakers described how data may be used to generate digital twins of processes and build virtual models of bioreactors and cell culture production processes. Such models can help scientists determine optimal cell harvest times, for example.
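
As a toy example of the digital-twin idea, the sketch below models viable cell density as logistic growth followed by a viability decline, then reads off a suggested harvest time. Real bioreactor twins are mechanistic models fitted to plant data; every parameter here is invented:

```python
import numpy as np

t = np.linspace(0, 14, 14 * 24)               # culture time in days, hourly
growth = 10.0 / (1 + 99 * np.exp(-0.9 * t))   # logistic growth, arbitrary cap
viability = 1 / (1 + np.exp(1.5 * (t - 10)))  # viability declines ~day 10
viable_density = growth * viability           # what we actually harvest

best = t[np.argmax(viable_density)]
print(f"model suggests harvesting around day {best:.1f}")
```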

What is slowing digitalization is the lack of a common operating system, or what Roberts calls an “orchestration backbone.” Such a system could enable life science vendors to plug into processes directly, using open data standards and application programming interfaces.
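
One way to picture such an orchestration backbone is a shared driver contract: every vendor implements the same interface and returns results in one open, self-describing schema. The interface and payload below are invented for illustration; real standardization efforts in this space include SiLA 2.

```python
from abc import ABC, abstractmethod
from typing import Any

class Instrument(ABC):
    @abstractmethod
    def run(self, protocol: dict[str, Any]) -> dict[str, Any]:
        """Execute a protocol and return a result in the shared schema."""

class PlateReader(Instrument):
    def run(self, protocol: dict[str, Any]) -> dict[str, Any]:
        # A vendor driver translates the shared protocol into its own
        # commands, then maps raw output back into the common schema.
        return {
            "instrument": "plate_reader_7",
            "protocol_id": protocol["id"],
            "measurements": [{"well": "A1", "od600": 0.42}],
            "units": {"od600": "absorbance"},
        }

result = PlateReader().run({"id": "proto-001", "read": "od600"})
print(result["measurements"])
```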

This point was reinforced by Mark Buswell, PhD, vice president, Discovery, Development, and Laboratory Systems, Information Technology, GlaxoSmithKline (GSK). Projects that encompass automation and information technology, he said, are 20% technology selection and 80% delivery. “Every vendor that comes to us now with systems has to publish their data in an open format,” he added.

Currently, many laboratory systems produce output as comma-separated values (CSV) and Excel files (or worse, proprietary file structures), which can oblige scientists to spend time as “data wranglers,” getting data into usable formats before they can do any meaningful analyses. It also means that scientists are becoming increasingly reliant on data analysts, and this can cause its own set of issues.
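
The sketch below gives a taste of that wrangling: a typical vendor export mixes metadata rows with data and embeds units in values, all of which must be stripped before analysis can begin. The file layout is invented:

```python
import io
import pandas as pd

vendor_csv = """Instrument: plate_reader_7
Run date: 2019-11-02
Well,Reading
A1,0.42 AU
A2,0.57 AU
"""

df = pd.read_csv(io.StringIO(vendor_csv), skiprows=2)   # skip metadata rows
df["od600"] = df["Reading"].str.replace(" AU", "", regex=False).astype(float)
df = df.drop(columns="Reading")                          # tidy numeric table
print(df)
#   Well  od600
# 0   A1   0.42
# 1   A2   0.57
```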

Steve Martin, PhD, vice president and head of Biopharm Molecular Discovery at GSK, explained: “Data scientists and lab scientists don’t naturally speak the same language. It’s critical that they work closely together so that data analysts can understand what insights the lab scientists are trying to draw from their data, and the lab scientists can see where automated analytics can help.

“When this partnership works, it can bring big benefits. I’ve seen months saved on project timelines through streamlined data flow and analytics.”

Closing the skills gap

The take-home message from the Lab of the Future conference is clear: despite all the technology and automation available, a lack of communication and a shortage of scientists skilled in the right disciplines may be the stumbling blocks to advancing the vision of innovative drug discovery and development.

“Data is becoming much more important, yet scientists with data analysis skills and biological or chemical backgrounds are very rare,” Martin stressed. “Getting these two groups to talk to each other in the same language is a problem.

“With increasing automation in the labs, the more physical jobs will be replaced with jobs of higher intellectual skill levels. In labs of the future, we will need people who can maintain and run automation, as well as analyze data. This is currently not a skill set that is in high supply, so universities will have to realize this and start to train appropriately.”
