Digital-twin technology often appears in discussions of Bioprocessing 4.0. At the Bioprocessing Summit in August, Maximilian Krippl, PhD, head of process modeling at Vienna-based Novasign, gave a presentation called “How Digital Twins Facilitate the Factories of Tomorrow: Current Obstacles and Solutions for the Biopharma Industry.” To find out more, GEN talked with Krippl about his work.

“Digital twins as in silico bioprocess representations are still in an exploratory phase,” Krippl said. “Academia and industry alike realize the benefits of digital twins, but the objectives are different, due to the lack of a definition and different visions and interpretations.”

As he explained, a digital twin might focus on various aspects of bioprocessing, such as data management, stakeholder involvement, or process prediction, but it is important to solve one aspect at a time. In particular, Krippl emphasized digital twins for process prediction and monitoring. He said that these features “will create a solid foundation to face the challenges in bioprocessing, such as batch-to-batch variability, out-of-specification runs, and consistency in highly specialized bioprocesses, such as those for gene or cell therapy.”

Difficult development process

In any bioprocessing application, digital-twin development is difficult. “Creating digital twins for bioprocesses is a highly interdisciplinary task that requires experts from bioprocessing, modeling, and automation,” Krippl continued. “Each party faces their own challenges.”

For example, bioprocess engineers analyze the critical process parameters (CPPs) that impact the critical quality attributes (CQAs) of a product. “A robust and well-understood process is the best prerequisite for the modelers,” Krippl said. “The modeler’s goal is to develop mathematical models that grasp the interplay between CPPs and CQAs within a desired design space.”
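To make the idea concrete, a model of this kind can be as simple as a regression mapping CPPs to a CQA across a design space. The sketch below is purely illustrative — the CPPs (temperature, pH), the CQA (titer), the linear response, and all numbers are assumptions for demonstration, not Novasign’s method or real process data:

```python
import numpy as np

# Hypothetical CPPs sampled across a design space (e.g., from DoE runs).
rng = np.random.default_rng(0)
temp = rng.uniform(35.0, 38.0, size=50)   # temperature, degrees C
ph = rng.uniform(6.8, 7.2, size=50)       # culture pH

# Synthetic CQA (product titer, g/L): an assumed response plus noise.
# The coefficients 0.8 and -2.0 are invented for this illustration.
titer = 0.8 * (temp - 35.0) - 2.0 * np.abs(ph - 7.0) + rng.normal(0, 0.05, 50)

# Fit a linear model CQA ~ CPPs by ordinary least squares.
X = np.column_stack([temp - 35.0, np.abs(ph - 7.0), np.ones_like(temp)])
coef, *_ = np.linalg.lstsq(X, titer, rcond=None)

# Predict the CQA for a candidate CPP combination inside the design space.
predicted = np.array([36.5 - 35.0, abs(7.05 - 7.0), 1.0]) @ coef
```

In practice, bioprocess models range from such statistical response surfaces to mechanistic or hybrid models, but the principle is the same: the model must capture how CPP changes propagate to CQAs.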

Automation engineers then develop control strategies based on the results of the models. To develop control strategies, “the model must be based on process variations introduced during early- to late-stage process development,” Krippl pointed out. “This allows the engineer and the model to differentiate between acceptable and nonacceptable CPP combinations.”

To make the most useful digital twin, the work starts in the development stage of a bioprocess. Consequently, “the major challenge for digital twins in drug bioprocessing lies in starting to perform runs and collect data in an early stage in order to apply it successfully at manufacturing scale,” Krippl explained. “A digital twin trained on setpoint data from a manufacturing run is not able to detect unfavorable conditions and won’t be able to correct the process trajectory in case of process deviation.”

As he added: “Only with the variation in the data is it possible to detect and counteract unwanted process outcomes.”
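Krippl’s point can be illustrated with a toy comparison: a model trained on runs with deliberate CPP variation recovers the process sensitivity, while one trained on setpoint-only data (where the CPP never moves) cannot. Everything here — the single CPP, the linear response, the coefficient — is an invented example, not an actual bioprocess model:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_slope(x, y):
    """Least-squares slope of y on x; with near-zero variance in x,
    the slope carries no information about the true sensitivity."""
    X = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[0]

# Case 1: training runs with deliberate CPP variation (hypothetical DoE data).
temp_varied = rng.uniform(35.0, 38.0, 40)
titer_varied = 0.8 * (temp_varied - 35.0) + rng.normal(0, 0.05, 40)

# Case 2: setpoint-only manufacturing runs -- temperature never moves.
temp_setpoint = np.full(40, 36.5)
titer_setpoint = 0.8 * (temp_setpoint - 35.0) + rng.normal(0, 0.05, 40)

slope_varied = fit_slope(temp_varied, titer_varied)       # recovers ~0.8
slope_setpoint = fit_slope(temp_setpoint, titer_setpoint) # not recoverable
```

The first model can flag and counteract a temperature excursion because it has learned the slope; the second has never seen temperature vary, so it cannot attribute an outcome change to that CPP — which is why variation must be built into the training data during development.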

In the face of so many objectives for digital twins in bioprocessing, much work remains to be done. “We are still far away from full implementation of digital twins that incorporate all of the above-mentioned aspects, such as data management, process prediction and control, and stakeholder interaction,” Krippl said. “In my opinion the biggest advantage and also beauty of a digital twin lies in its natural integration during process development along the product life cycle.” He added: “A clever process optimization strategy with the right sampling and measurements already paves the way for the implementation of the digital twin for control purposes during scale-up and manufacturing.” That can be done, according to Krippl, without an unmanageable increase in experimentation in early development of a bioprocess.

To get the most from this technology, Krippl said: “The keys to making digital twins a reality are breaking down data silos, communication between stakeholders, and creating a common culture and understanding.”
