Digital Twins and AI Reshape Biopharmaceutical Manufacturing

Biomanufacturers leverage innovations in process control and monitoring to expedite regulatory approval of drug products

By Gareth Macdonald

The quality of a drug product is determined by the manufacturing line. Preclinical assessments and clinical trials provide an indication of safety and efficacy, but it is the manufacturing process that ultimately determines how a drug will perform as a product.

Put simply, a poorly controlled manufacturing process will result in product variability and potentially put patients at risk. In contrast, a well-controlled manufacturing process yields a consistent product that can literally save lives. This uniformity is an essential part of the chemistry, manufacturing, and control strategies that manufacturers must demonstrate to regulators.

Digital twins—virtual counterparts of real-world manufacturing systems—gather information from monitoring devices to perform analytics and dynamically adjust manufacturing processes. In this image, a researcher at the Jefferson Institute for Bioprocessing installs monitoring devices to track bioreactor conditions. To derive more value from monitoring technology and improve process control, biomanufacturers may combine digital twins with artificial intelligence technology.

“In terms of securing drug approvals, clinical results alone are not sufficient,” emphasizes Mohamed Noor, PhD, digitalization manager at the National Institute of Bioprocessing Research and Training (NIBRT) in Dublin. “Regulators will want to be assured that clinical manufacturing by a sponsor can translate into robust commercial manufacturing, batch after batch throughout the lifecycle.”

But achieving process control in the plant can be a colossal technical undertaking.

“Bioprocess control is more than just automation,” says Anurag Rathore, PhD, coordinator of the DBT Center of Excellence for Biopharmaceutical Technology at the Indian Institute of Technology in Delhi. “It includes aspects such as system architecture, software applications, hardware, and interfaces, all of which are optimized and compiled per demand. This needs to be accomplished while keeping process requirements, production costs, regulatory constraints, and data acquisition in mind.”

Improving process control

The foundation of any control strategy is process understanding. And, according to the ICH’s Q8 guidance,1 modeling is the best way to generate process understanding and meet regulators’ quality-by-design expectations. The models should describe the relationship between process parameters and drug quality and performance attributes.

Statistical models—predictions based on available data—have proven to be the most popular approach so far. Many manufacturers have used data-based models to guide development, scale-up, and process control. But their predictive power is limited to the range of data available, and they require significant experimental effort.
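To make the limitation concrete, consider a minimal sketch of such a data-driven model: a quadratic response surface fit to a handful of runs. Every detail here is an illustrative assumption rather than data from any manufacturer, including the chosen parameters (temperature and pH), the titer values, and the quadratic form itself.

```python
# Minimal sketch of a data-driven (statistical) process model: a quadratic
# response surface fit to a few hypothetical experimental runs.
import numpy as np

# Hypothetical DoE results: (temperature degC, pH) -> titer (g/L)
X = np.array([[35.0, 6.8], [35.0, 7.2], [37.0, 7.0],
              [39.0, 6.8], [39.0, 7.2], [37.0, 6.8], [37.0, 7.2]])
y = np.array([2.1, 2.4, 3.0, 2.2, 2.0, 2.7, 2.8])

def features(X):
    """Quadratic response-surface terms: 1, T, pH, T*pH, T^2, pH^2."""
    t, p = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(t), t, p, t * p, t ** 2, p ** 2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

def predict(temp, ph):
    return (features(np.array([[temp, ph]])) @ coef)[0]

print(predict(37.0, 7.0))   # interpolation: inside the calibrated range
print(predict(42.0, 7.6))   # extrapolation: outside the data, unreliable
```

Inside the calibrated range the fit interpolates reasonably; step outside it, as Li cautions below, and the prediction has no physical grounding to fall back on.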

As Angela Li, PhD, senior scientist at Sanofi, warns, “Traditional data-driven approaches using scale-down models to predict scale-up performance provide no system knowledge or physical understanding of the process, and are, therefore, error prone when used to make predictions during scale-up.”

Digital Demonstration for Innovation
The Jefferson Institute for Bioprocessing and Boston Consulting Group (BCG) recently launched the Digital Demonstration for Innovation, a digitally immersive experience that helps executives understand how Industry 4.0 technologies can drive value throughout biopharmaceutical operations. The center has been designed to showcase key Industry 4.0 technologies such as digital simulation, industrial Internet of Things, advanced analytics, and augmented reality. “The maturity of the technology and the current state of the industry have converged to create a unique opportunity to apply Industry 4.0 technologies,” said Paul Poduri, a BCG managing director and partner. “These emerging technologies, particularly when combined with lean methodologies, could lead to significant improvements in productivity.”

For this reason, mechanistic models, which are built on known physical and biological principles rather than on data alone, are gaining in popularity. Mechanistic models “can provide a full description of the system, higher prediction power, as well as the potential to extrapolate well outside of calibration space,” Li explains. “They are valuable tools for predicting scale-up process performance, thereby de-risking large-scale manufacturing runs.”
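To illustrate the difference, here is a minimal sketch of a mechanistic model: Monod growth kinetics for biomass and substrate in a batch bioreactor, integrated with SciPy. The kinetic parameters and initial conditions are assumed values chosen for illustration only.

```python
# Minimal sketch of a mechanistic cell-culture model: Monod growth kinetics
# for biomass (X) and substrate (S) in a batch bioreactor. All parameter
# values are illustrative assumptions, not values from the article.
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX = 0.045   # 1/h, maximum specific growth rate (assumed)
K_S = 0.5        # g/L, Monod half-saturation constant (assumed)
Y_XS = 0.6       # g biomass per g substrate, yield coefficient (assumed)

def batch_bioreactor(t, state):
    X, S = state
    S = max(S, 0.0)                      # guard against tiny solver overshoots
    mu = MU_MAX * S / (K_S + S)          # Monod specific growth rate
    dX = mu * X                          # biomass accumulation
    dS = -(mu / Y_XS) * X                # substrate consumption
    return [dX, dS]

sol = solve_ivp(batch_bioreactor, (0.0, 120.0), [0.2, 20.0])
print(f"final biomass: {sol.y[0, -1]:.2f} g/L, "
      f"residual substrate: {sol.y[1, -1]:.2f} g/L")
```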

Employing mechanistic models

Beyond scale-up, mechanistic modeling can help engineers better characterize—and therefore understand—production processes. In particular, such models can be valuable tools for characterizing and understanding the design space.

“Mechanistic models,” Li adds, “are high-fidelity models that allow the users to simulate hundreds and thousands of experiments without experimental effort, providing a far more accurate picture of the design space.”

She compares them to traditional design of experiments approaches, such as central composite design, which select only a few experimental points in the design space. “The data are spread so thin that missing one experimental point may skew the results,” she notes. “And often, complex biological processes cannot be fully described by simple quadratic equations [even though such equations are utilized] in central composite design.”
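A rough sketch of what such in silico design-space exploration can look like follows: sweep a dense grid of process parameters through a process model and flag the combinations that meet an acceptance criterion. The stand-in model, the parameter ranges, and the titer limit are all hypothetical.

```python
# Minimal sketch of in silico design-space mapping: evaluate a process model
# across a grid of process parameters and flag which combinations satisfy a
# quality acceptance criterion. Model and limit are illustrative.
import numpy as np

def simulated_titer(temp, feed_rate):
    """Stand-in for a mechanistic process model (hypothetical response)."""
    return 3.0 - 0.05 * (temp - 37.0) ** 2 - 0.8 * (feed_rate - 1.2) ** 2

temps = np.linspace(33.0, 41.0, 81)      # degC
feeds = np.linspace(0.5, 2.0, 76)        # L/h
T, F = np.meshgrid(temps, feeds)

titer = simulated_titer(T, F)            # thousands of "experiments" at once
design_space = titer >= 2.5              # acceptance criterion (assumed)

print(f"{design_space.mean():.0%} of the explored region meets the titer spec")
```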

Li notes that mechanistic models are starting to be used in day-to-day operations on the factory floor. “Mechanistic models can be integrated into process automation control systems,” she details, “and together with process analytical technology, they can be used for real-time process monitoring and control. They are useful throughout the product development lifecycle, including process optimization.” She adds that they can enhance process understanding, support quality-by-design process characterization studies, and de-risk large-scale clinical and commercial manufacturing runs.

Twin ambitions

Li and colleagues on Sanofi’s vaccine chemistry, manufacturing, and control team used a mechanistic approach to model a chromatography process. “The model provided design-space understanding,” she relates. “In addition, the model accurately simulated a pilot scale-up chromatography run.” Besides supporting process scale-up, the model—referred to as a “digital twin”—has provided a proof-of-concept demonstration of how mechanistic models for chromatography may facilitate in silico process development and characterization.

Digital twins are in silico models of processes. They are a major research focus for Parviz Ayazi-Shamlou, PhD, vice president of the Jefferson Institute for Bioprocessing in Philadelphia, who explains that twins made during development can inform process control and more.

For example, Ayazi-Shamlou suggests that creating a digital twin for a bioreactor would allow bioprocess engineers to design and operate a cell culture operation entirely in silico. “Once proven,” he explains, “this kind of digital twin can be used to aid process development and optimization, answer ‘what if?’ questions about the operation, and interrogate process deviations, considerably reducing experimental work, time, and costs.”
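As a rough illustration of this kind of “what if?” interrogation (a sketch under assumed kinetics, not the institute’s actual twin), the snippet below compares a nominal fed-batch run with a scenario in which the feed pump fails for 16 hours.

```python
# Minimal sketch of using a bioreactor digital twin to interrogate a process
# deviation in silico: nominal fed-batch run vs. a feed-pump outage.
# The model structure and every parameter value are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

MU_MAX, K_S, Y_XS = 0.045, 0.5, 0.6   # assumed Monod kinetic parameters
S_FEED = 200.0                        # g/L substrate in the feed (assumed)

def fed_batch(t, state, feed_profile):
    X, S, V = state                   # biomass g/L, substrate g/L, volume L
    F = feed_profile(t)               # feed rate, L/h
    S_pos = max(S, 0.0)               # guard against tiny solver overshoots
    mu = MU_MAX * S_pos / (K_S + S_pos)
    dX = mu * X - (F / V) * X                        # growth minus dilution
    dS = -(mu / Y_XS) * X + (F / V) * (S_FEED - S)   # consumption plus feed
    dV = F
    return [dX, dS, dV]

nominal = lambda t: 0.02                                      # L/h constant feed
pump_failure = lambda t: 0.0 if 40.0 <= t <= 56.0 else 0.02   # 16 h outage

for name, profile in [("nominal", nominal), ("feed outage", pump_failure)]:
    sol = solve_ivp(fed_batch, (0.0, 120.0), [0.5, 2.0, 5.0],
                    args=(profile,), max_step=1.0)
    print(f"{name}: final biomass {sol.y[0, -1]:.1f} g/L")
```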

He notes, “Another important use of digital twin technology is in training next-generation bioprocess scientists and engineers, much like aviation simulators are used today in the training of pilots.”

Artificial control

In some industries, the use of artificial intelligence (AI) in production has almost become the norm. According to a recent MIT survey2 of 11 sectors—including the manufacturing, consumer goods, and retail industries—quality control is the third most common use of AI.

AI has also found applications in biopharmaceutical production, albeit in a more limited way. “Most uses are focused on improved process control of individual unit operations,” explains Kiefer Eaton, co-founder of industrial software firm Basetwo AI. These include the bioreactor-centered or chromatography-column-centered operations that execute the major upstream and downstream processes for a biologic manufacturing line.

In addition, it is possible to control an entire production process with AI, provided the right training data and IT infrastructure are available. “A manufacturer could model an entire bioprocess facility by connecting the data from each unit operation in series and funneling model outputs between unit operations to achieve what we call ‘holistic modeling,’” Eaton elaborates. Examples of holistic modeling, he notes, were discussed in a recent review prepared by Boehringer Ingelheim3 scientists.
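A minimal sketch of the chaining idea follows, with made-up unit operations, step yields, and impurity levels; it is not intended to represent Basetwo’s or Boehringer Ingelheim’s implementations.

```python
# Minimal sketch of "holistic" modeling: chain simple unit-operation models so
# that each step's outputs become the next step's inputs. Yields and aggregate
# levels are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Stream:
    mass_g: float           # product mass
    aggregate_frac: float   # high-molecular-weight impurity fraction

def bioreactor(titer_g_per_l: float, volume_l: float) -> Stream:
    return Stream(mass_g=titer_g_per_l * volume_l, aggregate_frac=0.08)

def capture_chromatography(s: Stream) -> Stream:
    return Stream(mass_g=s.mass_g * 0.85, aggregate_frac=s.aggregate_frac * 0.5)

def polish_chromatography(s: Stream) -> Stream:
    return Stream(mass_g=s.mass_g * 0.90, aggregate_frac=s.aggregate_frac * 0.2)

def whole_process(titer: float, volume: float) -> Stream:
    return polish_chromatography(capture_chromatography(bioreactor(titer, volume)))

print(whole_process(titer=3.0, volume=2000.0))
```

Chaining the steps this way is also what makes the whole-plant trade-offs Eaton describes next testable in silico.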

“Such a holistic model could be used to identify bottlenecks in a production process or achieve more complex whole-plant optimization,” Eaton continues. For example, accepting a local loss in product yield during an upstream process could lead to higher final product yield if it enables improvements in downstream purification steps (for example, by preventing aggregates or undesirable post-translational modification patterns in the case of biologics).

The potential benefits of whole-process AI control are significant. A key advantage over traditional data-driven approaches is the ability to more accurately model and understand a process—even in real time.

“At Basetwo, we focus on combining machine learning or AI with engineering knowledge to build hybrid process models that can learn process dynamics better than any traditional mechanistic or data-driven approaches,” Eaton explains. “At a high level, these models leverage the power of AI in an engineering context to better learn from process data.”
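One common way to build such a hybrid model, sketched below under assumed data and model forms rather than Basetwo’s actual method, is to let a mechanistic backbone carry the engineering knowledge and fit a data-driven correction to its residuals.

```python
# Minimal sketch of a hybrid process model: a mechanistic prediction corrected
# by a data-driven model fit to the residuals. The mechanistic form, the plant
# data, and the correction are all illustrative assumptions.
import numpy as np

def mechanistic_titer(temp):
    """First-principles-style backbone (hypothetical form)."""
    return 3.0 - 0.05 * (temp - 37.0) ** 2

# Observed plant data deviating from the mechanistic form (hypothetical values)
temps_obs = np.array([34.0, 35.0, 36.0, 37.0, 38.0, 39.0, 40.0])
titer_obs = np.array([2.45, 2.75, 2.98, 3.12, 3.05, 2.85, 2.50])

# Data-driven component: fit a low-order polynomial to the residuals
residuals = titer_obs - mechanistic_titer(temps_obs)
correction = np.poly1d(np.polyfit(temps_obs, residuals, deg=2))

def hybrid_titer(temp):
    return mechanistic_titer(temp) + correction(temp)

print(hybrid_titer(37.5))
```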

Beyond improving process visibility, these models will, Eaton anticipates, enable forecasts of how a process will evolve over time. This will give manufacturers the ability to predict when best to harvest or transfect a batch. Importantly, the models will also issue alerts when a batch or process is drifting out of specification, giving engineers sufficient time to implement corrective actions before a deviation occurs or product yield or quality is compromised.
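A minimal sketch of the alerting idea, assuming a single trending in-process measurement, a simple linear forecast, and an arbitrary specification limit:

```python
# Minimal sketch of batch drift detection: fit a trend to recent in-process
# measurements, forecast forward, and raise an alert if the forecast crosses a
# specification limit before the planned harvest time. All values are assumed.
import numpy as np

SPEC_LIMIT = 5.0        # maximum allowed metabolite level (assumed units)
HARVEST_HOUR = 96.0

hours = np.array([48.0, 54.0, 60.0, 66.0, 72.0])    # recent sample times
metabolite = np.array([3.1, 3.4, 3.8, 4.1, 4.5])    # trending upward

slope, intercept = np.polyfit(hours, metabolite, deg=1)
forecast_at_harvest = slope * HARVEST_HOUR + intercept

if forecast_at_harvest > SPEC_LIMIT:
    crossing_hour = (SPEC_LIMIT - intercept) / slope
    print(f"ALERT: forecast {forecast_at_harvest:.2f} exceeds spec; "
          f"projected excursion near hour {crossing_hour:.0f}")
else:
    print("Batch forecast within specification at planned harvest")
```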

A hurdle for AI

Cameron Bardliving, PhD, director of process development and operations for the Jefferson Institute for Bioprocessing, has a slightly different take. Although Bardliving agrees that AI could have a greater role in process control, he points out that there are still product-related challenges to overcome.

Bardliving cites the “exceptionally sensitive” relationship between biopharmaceuticals and their process environments as one of the major hurdles to wider use of AI. “It is challenging to predict a priori a biologic’s critical quality attributes,” he says, “and the lack of first-principle knowledge about the structure-function-process triangle makes it difficult to create a fully predictive AI-based model that can be used in a manufacturing setting.”

Although fully AI-driven process control has been relatively slow to arrive in biopharmaceutical manufacturing, much progress has been made. In addition, work continues across multiple fronts to make a fully integrated, smart, AI-based manufacturing factory of the future a reality. Ultimately, the goal of any biomanufacturing process is to satisfy the quality-by-design requirements instituted by regulators.

 

References
1. European Medicines Agency/Committee for Human Medicinal Products/International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use. ICH guideline Q8 (R2) on pharmaceutical development. Published June 22, 2017. Accessed July 8, 2022.
2. McCauley D. The global AI agenda: Promise, reality, and a future of data sharing. MIT Technology Review Insights; 2020.
3. Smiatek J, Jung A, Bluhmki E. Towards a Digital Bioprocess Replica: Computational Approaches in Biopharmaceutical Development and Manufacturing. Trends Biotechnol. 2020; 38: 1141–1153. DOI: 10.1016/j.tibtech.2020.05.008.