Digital twin (DT) models of processes are a path to efficient drug production, as long as industry has the right data capture and analysis tools, according to a new study in which Krist Gernaey, PhD, a professor at the Technical University of Denmark, and colleagues looked at how in-process data can be used to create digital models of production processes.

The key to successful use of a digital twin is to establish a feedback system where information from the model is used to optimize the process, notes Gernaey.

“The main benefit of a digital twin is that it should allow you to create value from data collected on the process, especially the on-line data,” he says. “We draw a distinction between a digital shadow and a DT. A digital shadow of a process receives data from the process and generates new data through calculations made with these data, but there is no feedback to the process.

“In a DT, one collects data, one uses these data for example to make predictions about the future state of a process, and then one also makes on-line adjustments to the process based on these predictions. So, potentially, the DT concept should lead to a more optimal, more efficient, improved production process by making better use of the data that are routinely collected on the process.”
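To make the distinction concrete, here is a minimal Python sketch, not taken from the study; all function names, values, and the feed-rate adjustment are hypothetical stand-ins. The shadow only computes with process data, while the twin closes the loop by writing an adjustment back to the process.

```python
# Sketch of the shadow-vs-twin distinction described above.
# All names (read_sensors, predict_state, write_setpoint) and values
# are hypothetical; a real deployment would talk to a process historian
# or control system, not these stubs.

def read_sensors():
    """Stub: return the latest on-line measurements from the process."""
    return {"temperature_C": 37.1, "pH": 7.02, "biomass_g_per_L": 12.4}

def predict_state(measurements, horizon_h=2.0):
    """Stub model: crude linear extrapolation of biomass growth."""
    growth_rate = 0.5  # g/L/h, assumed purely for illustration
    return measurements["biomass_g_per_L"] + growth_rate * horizon_h

def digital_shadow_step():
    """Shadow: data flow in one direction only -- no feedback to the process."""
    measurements = read_sensors()
    prediction = predict_state(measurements)
    print(f"Predicted biomass in 2 h: {prediction:.1f} g/L")  # reported, not acted on

def digital_twin_step(write_setpoint):
    """Twin: the prediction is used to adjust the running process."""
    measurements = read_sensors()
    prediction = predict_state(measurements)
    if prediction > 13.0:  # illustrative threshold
        write_setpoint("feed_rate_L_per_h", 0.8)  # close the loop

# Example: one shadow iteration, then one twin iteration with a stand-in actuator.
digital_shadow_step()
digital_twin_step(lambda tag, value: print(f"set {tag} = {value}"))
```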

Building a twin

Building a digital twin can be a daunting prospect for a biopharmaceutical company that lacks an established IT infrastructure, according to Gernaey.

“To me the main challenges are that you need to put infrastructure in place for real-time handling of data—implementation of AI tools, creation of a data lake, cloud, Internet of Things technology—and then take a critical look at available measurement equipment and potentially invest in additional measurements,” he explains. “Furthermore, one needs to have staff that can support implementation of the digital twin.”

Fortunately, researchers have established common steps that can guide companies through the process, continues Gernaey.

“The first step is mapping the data currently collected on a process, before defining a realistic target for the DT,” he says, adding, “The next stage is the installation of additional measurement equipment, depending on whether the model is mechanistic, data-driven, or a hybrid of both.

“Following implementation and validation of the model, it is important to use new process data to ensure that predictions of the DT are realistic and are in line with measurements on off-line samples that are analyzed during such a validation phase.”
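As a rough illustration of that validation phase, the sketch below compares hypothetical DT predictions with off-line laboratory measurements and applies an assumed 10% relative-error acceptance criterion; none of the numbers come from the study.

```python
# Sketch of the validation step Gernaey describes: comparing DT predictions
# against off-line reference analyses. The data values and the 10% tolerance
# are illustrative assumptions, not from the study.

predicted = [11.8, 13.2, 14.9, 16.3]   # DT predictions, g/L
offline   = [12.0, 13.0, 15.4, 16.0]   # lab measurements of off-line samples, g/L

errors = [abs(p - o) / o for p, o in zip(predicted, offline)]
mean_rel_error = sum(errors) / len(errors)

# Accept the model only if its predictions track the reference analyses.
TOLERANCE = 0.10  # 10% mean relative error, an assumed acceptance criterion
if mean_rel_error <= TOLERANCE:
    print(f"DT validated: mean relative error {mean_rel_error:.1%}")
else:
    print(f"DT needs recalibration: mean relative error {mean_rel_error:.1%}")
```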

Small steps

It is also a good idea to start small, advises Gernaey.

“One can develop a DT for a single unit operation or for a whole process,” he tells GEN. “Of course, complexity of a twin applied to a single unit operation is usually much lower compared to one covering a whole process. I would suggest starting at the unit operation level, to build experience and gain trust in the concept.”

There are other advantages to starting small and implementing a digital twin in a stepwise manner, Gernaey says, citing maintenance as an example.

“Once data are available in the cloud, preventive maintenance will focus on detecting abnormal patterns in data, which could potentially indicate the malfunctioning of a piece of equipment,” he points out.

“When that is detected, maintenance can be scheduled, instead of experiencing an unexpected breakdown of equipment. Preventive maintenance does not necessarily qualify as a full-fledged digital twin, but it is a first step on the way to gaining experience with such concepts.”
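One simple form such pattern detection could take, sketched below with an assumed window size and threshold (neither from the article), is a rolling statistical check that flags readings deviating far from recent history.

```python
# Sketch of the pattern-detection idea behind preventive maintenance:
# flag sensor readings that deviate strongly from recent history.
# The window size and 3-sigma threshold are assumptions for illustration.

from collections import deque
from statistics import mean, stdev

def make_anomaly_detector(window=50, n_sigma=3.0):
    history = deque(maxlen=window)

    def check(reading):
        """Return True if the reading deviates >n_sigma from recent history."""
        is_abnormal = False
        if len(history) >= 10:  # need some history before judging
            mu, sigma = mean(history), stdev(history)
            is_abnormal = sigma > 0 and abs(reading - mu) > n_sigma * sigma
        history.append(reading)
        return is_abnormal

    return check

# Usage: feed each new vibration/pressure/temperature reading to the detector.
check = make_anomaly_detector()
for value in [1.01, 0.99, 1.02, 0.98, 1.00, 1.01, 0.97, 1.03, 1.00, 0.99, 5.7]:
    if check(value):
        print(f"Abnormal reading {value}: schedule preventive maintenance")
```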
