August 1, 2018 (Vol. 38, No. 14)

Balancing Bioreactor and Process Parameters Avoids Production Discord and Amplifies Flows

When developing, configuring, or operating a bioreactor, one should avoid “pulling out all the stops,” as the saying goes. To understand why, one needs to know that in this saying, “the stops” originally referred to the control elements of another instrument, the pipe organ. Pulling different stops directs pressurized air through different sets of pipes, producing different sounds, not all of which work well together. Similarly, proper adjustment of a bioreactor’s controls, or manipulation of the variables that influence an upstream process, requires something like a good ear. Some combinations may produce pleasing harmonies in the form of high yields, batch by batch, or consistent output from densely packed but highly viable cell cultures. Other combinations, however, may clash, jarringly—or even worse, expensively.

GEN recently asked a number of bioprocess experts to explain how bioreactor technology and process design can satisfy the biopharma industry’s multiple and potentially conflicting desires, including greater cell densities, increased titers, larger yields, faster process transfers, and higher quality. Taken together, the biomanufacturing specialists’ comments emphasize the importance of striking the right balance, of knowing how to orchestrate bioreactor and process variables so that they reinforce each other and generate surging, pulsing, or continuous upstream flows, whatever sort of performance may be needed.

Process Intensification

The aim of upstream optimization is to develop processes that increase product titers, says Stefan R. Schmidt, Ph.D., head of production operations at BioAtrium, a joint venture launched by Lonza and Sanofi. According to Lonza, the joint venture will encompass a large-scale biologics production facility, which is being built in Visp, Switzerland.

“When we talk about process intensification,” says Dr. Schmidt, “there are two angles: decreasing processing times, and increasing cell densities.” In upstream processes, these angles converge on the ultimate goal of higher titers and yields. “The aim,” he insists, “is to make more product from each batch.”

Process intensification, with its emphasis on yield, can guide the earliest stages of process development. For example, yield considerations can influence the development and selection of cell lines. Process intensification can also be applied a little later. For example, at BioAtrium, Dr. Schmidt is growing higher density cell cultures while ensuring that nutrient levels are always optimized.

“Cell cultures are often overfed with glucose,” he notes. “Unfortunately, excess glucose can lessen the quality of the resulting product. In theory, establishing a feedback loop allows glucose levels to be optimized in real time. Doing so, however, requires online analytical technologies. As it happens, these technologies are being developed, particularly in the field of spectroscopic analysis.”
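The feedback loop Dr. Schmidt describes can be pictured as a simple controller that trims the glucose feed rate toward a setpoint as online readings arrive. The following is a minimal, hypothetical sketch of that idea, not any vendor's control software; the setpoint, gain, and feed limits are all assumed values for illustration.

```python
# Illustrative proportional feedback loop for glucose feeding.
# All names and numbers are hypothetical, not a real controller.

GLUCOSE_SETPOINT = 4.0          # g/L, assumed target residual glucose
KP = 0.5                        # proportional gain (L/h per g/L of error)
FEED_MIN, FEED_MAX = 0.0, 2.0   # feed-rate limits, L/h

def update_feed_rate(measured_glucose, current_feed):
    """Nudge the feed rate toward the glucose setpoint."""
    error = GLUCOSE_SETPOINT - measured_glucose
    new_feed = current_feed + KP * error
    # Clamp so the controller never overfeeds or reverses flow.
    return max(FEED_MIN, min(FEED_MAX, new_feed))

# Example: an online reading shows the culture running rich (6 g/L),
# so the controller cuts the feed back from 1.0 L/h.
feed = update_feed_rate(measured_glucose=6.0, current_feed=1.0)
```

In practice the "measured_glucose" input would come from the online spectroscopic analytics Dr. Schmidt mentions, sampled at each control interval.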

Process intensification is a complex task. Doing it effectively can involve rethinking the bioreactors in which culturing takes place.

“Growing cells generate heat whether they are of mammalian, insect, or bacterial origin,” Dr. Schmidt points out. “This heat needs to be managed and removed to ensure that the optimal conditions for the culture are maintained.

“Heat generated by cells is a problem for single-use, disposable bioreactors, which usually have an insulating layer. One approach to this issue is to use smaller volumes. At present, 2000-L reactors are standard, but 1000-L reactors could be of significant help when it comes to dissipating heat.”

“For steel bioreactors,” Dr. Schmidt says, “heat dissipation is more straightforward—for obvious reasons.”

When one is trying to increase cell densities, one must also consider aeration, which can be impeded by the particulate matter that is abundant in cultures grown at higher densities. “To address this issue,” Dr. Schmidt asserts, “it is necessary to find a way of mixing the culture in a manner that does not disrupt optimal growth.”

Fortunately, technology developers are keeping pace with efforts to intensify upstream processes. “Bioreactors and associated technologies are continually being improved,” Dr. Schmidt observes. “And suppliers are normally very responsive to end-user demands.”

“Pall is a good example,” he continues. “The company has been very proactive when it comes to acquiring technologies that facilitate continuous processing, which is a fast-emerging area of demand for the biopharmaceutical industry.

“Repligen, too, is very good at facilitating process intensification. Although the company does not offer a wide range of bioprocessing technologies, it has been very active at implementing measures that facilitate short run times.”

A Learning Process

Monitoring the many variables that impact conditions in bioreactors generates data—a lot of data. This information is key to ensuring that proteins and monoclonal antibodies generated by upstream processes are of the desired quantity and quality.

Data from bioreactors can also be used for optimization, suggests Wei-Chien Hung, Ph.D., a process development scientist at Alexion. According to Dr. Hung, a data analysis method called machine learning (ML) can be used to tweak production processes.

ML, a subdiscipline of artificial intelligence, refines automated decision-making processes by enabling them to optimize themselves through the statistical analysis of data. With ML, processes effectively train themselves.

“We use ML to identify the parameters that have the biggest impact on product quality and to help reduce ‘noise,’” Dr. Hung says. “ML helps us to create more accurate predictive models.” Such models, Dr. Hung continues, indicate that parameters such as ammonia, glucose, and glutamic acid have the greatest impact on product attributes such as titer and sialic acid content.

The first stage of the ML process is to collect data for all the parameters that characterize the operations of a bioreactor. Once sufficient information has been collected, it is fed into various ML algorithms, which include decision tree, random forest, and naive Bayes algorithms. Then the “learning”—the analytical model building—begins.

“These algorithms are kind of like a black box—you input the data and wait for the results,” Dr. Hung remarks. “Over time, we found that the decision tree was the most effective of the algorithms in terms of the accuracy of the model produced.”
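The intuition behind tree-based parameter ranking can be sketched in a few lines: for each monitored parameter, find the single threshold split that best separates high-titer from low-titer runs (a one-level "decision stump"). A real workflow like Dr. Hung's would use full decision trees on production data; everything below, including the run data and parameter names, is invented for illustration.

```python
# Hypothetical sketch of ranking bioreactor parameters by how cleanly
# a single threshold on each one separates high- from low-titer runs.

runs = [
    # (ammonia mM, glucose g/L, glutamate mM, high_titer?)
    (2.0, 4.0, 1.0, True),
    (2.5, 4.5, 1.2, True),
    (6.0, 4.2, 1.1, False),
    (7.0, 8.0, 1.3, False),
    (2.2, 7.5, 1.0, True),
    (6.5, 4.1, 1.2, False),
]

def stump_accuracy(values, labels):
    """Best accuracy achievable by thresholding one parameter."""
    best = 0.0
    for t in values:
        for below_is_high in (True, False):
            correct = sum(
                ((v <= t) == below_is_high) == y
                for v, y in zip(values, labels)
            )
            best = max(best, correct / len(values))
    return best

labels = [r[3] for r in runs]
names = ["ammonia", "glucose", "glutamate"]
scores = {
    name: stump_accuracy([r[i] for r in runs], labels)
    for i, name in enumerate(names)
}
ranked = max(scores, key=scores.get)  # parameter with the cleanest split
```

In this toy dataset, ammonia splits the runs perfectly, so it would be flagged as the dominant parameter; a full tree simply stacks such splits, which is why it remains interpretable despite the "black box" feel.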

The more data, the better the learning process, Dr. Hung and his colleagues learned. “We decided to apply ML to help us build better models of our upstream processes because the product in question had a low titer,” he recalls. “However, what was initially a disadvantage became an advantage because it required that we collect data from multiple production runs, which ultimately helped us to produce more accurate predictive models.”

The main application of these models is to optimize conditions within the bioreactor. These models, Dr. Hung adds, can also be applied to other parts of the production process.

“One of the ways we use the models upstream is to streamline the use of raw materials,” Dr. Hung points out. “When we have established the most important parameters, we no longer need to prescreen raw materials.

“Previously, when we still had to prescreen raw materials, we used upstream culturing at a small scale. But with our ML models, we can be confident that we have already identified the most important parameters in terms of the quality of the final product, without screening.”

Another advantage of the approach is cost reduction. “Our analysis,” says Dr. Hung, “suggests that the approach saves us around $200,000 per production run, based on shorter running times and lower labor costs.”

Computational Fluid Dynamics

A computational approach can also be used to optimize upstream processes during scale-up, says Michelle LaFond, senior director, bioreactor scaleup and development, Regeneron Pharmaceuticals. LaFond uses computational fluid dynamics (CFD) to identify laboratory-scale culture parameters that will be hardest to replicate at manufacturing scale. These are known as scale-dependent parameters.

Current approaches to identifying scale-dependent parameters—which are based on correlations for power per unit volume (P/V) and stripping gases—assume that the production bioreactor will simply be a bigger version of the laboratory bioreactor. Although these approaches are effective when laboratory and production bioreactors are like-for-like in terms of impeller and sparger geometries, they are insufficient when bioreactor geometries differ.
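The P/V correlation mentioned above can be made concrete with the standard stirred-tank relation for the turbulent regime, P = Np × ρ × N³ × D⁵, where Np is the impeller power number, ρ the broth density, N the stir speed, and D the impeller diameter. The sketch below shows how holding P/V constant fixes the large-scale stir speed under geometric similarity; the power number, vessel sizes, and speeds are assumed values for illustration, not Regeneron's numbers.

```python
# Minimal sketch of constant-P/V scale-up for geometrically similar
# stirred tanks (all numbers hypothetical).

NP = 5.0      # impeller power number (assumed, turbulent Rushton-type)
RHO = 1000.0  # broth density, kg/m^3

def power_per_volume(stir_speed, impeller_diam, volume):
    """P/V in W/m^3: stir speed in rev/s, diameter in m, volume in m^3."""
    power = NP * RHO * stir_speed**3 * impeller_diam**5
    return power / volume

def matched_stir_speed(pv_target, impeller_diam, volume):
    """Stir speed (rev/s) that reproduces a target P/V at a new scale."""
    return (pv_target * volume / (NP * RHO * impeller_diam**5)) ** (1 / 3)

# Lab scale: 5-L (0.005 m^3) vessel, 0.06-m impeller, 5 rev/s.
pv_lab = power_per_volume(5.0, 0.06, 0.005)
# Production: 2000-L (2 m^3) vessel, geometrically similar 0.44-m impeller.
n_prod = matched_stir_speed(pv_lab, 0.44, 2.0)
```

This is exactly the like-for-like assumption the passage describes: when impeller or sparger geometries differ between scales, a single correlation like this no longer captures the local mixing and shear environment, which is where CFD comes in.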

The inability to identify scale-dependent parameters, warns LaFond, results in tinkering during scale-up, leading to lengthy and therefore costly technology transfer.

“CFD, in contrast, allows us to leverage thermodynamic principles and develop a more comprehensive, high-resolution understanding of the fluid properties within a bioreactor,” she explains. “It also leads to faster data generation and decision making due to minimized reliance on empirical data.

“By implementing CFD, the impact of different bioreactor designs is mitigated. CFD also allows us to simulate conditions in the bioreactors for various agitation rates and generate a library of engineering parameters such as P/V, energy dissipation rate, mixing times, and shear. While some of this can be done empirically, the simulations help us to cover more ground and obtain more granular data by modeling localized fluid properties rather than just the mean response that you get from bioreactor experiments.”

LaFond and her colleagues use the data to make predictive scale-down bioreactor models to determine optimal operating conditions at the pilot scale in accordance with quality-by-design principles.

“The key benefit is a seamless transfer to manufacturing, obviating the need for process modifications or investigations at scale,” LaFond declares. “This is a direct result of the data-driven process transfers to commercial scale, with more of the effort and resources being spent in characterizing the process during development.”

Disposable Technology

As bioprocess development progresses from stage to stage, it may switch from one bioreactor system to another, so that at each stage, the most suitable bioreactor system is used. This kind of flexibility is becoming more common as the number of specialist technologies grows.

For example, while Bristol-Myers Squibb primarily uses large-scale bioreactors for commercial manufacturing, the company conducts some of its early process development, specifically clone selection, in disposable systems.

The idea is to use the most effective technology for each unit operation, comments Ping Xu, Ph.D., a senior scientist in Bristol-Myers Squibb’s biologics development and supply division. According to Dr. Xu, the company uses Sartorius Stedim Biotech’s ambr 250 system for clone selection and early-stage biologics process development.

“The instrument has been integrated into our platform workflow to select the lead clone from 6 or 12 research cell bank clones,” he says. “After the lead clone is selected, the ambr 250 system allows us to optimize the process. In the past, we used shake flasks and 5-L bioreactors in parallel for screening clones and process development.”

The advantages are manifold, asserts Dr. Xu, who points out that one employee using the disposable technology can achieve in a day what a team of three employees using the firm’s previous approach would have accomplished only after several days.

The ambr 250’s advantage with respect to time savings is due to the system’s fast turnaround, asserts Dr. Xu, who adds that this advantage is especially important to scientists “facing tight timelines for achieving first-in-human applications for new biologic molecules.”

“The consumables for the ambr 250 are slightly costlier than those for 5-L bioreactors,” Dr. Xu notes. “But the ambr 250 also reduces labor costs, so it is still valuable in process development.”

All of the bioprocessors interviewed for this article will be speakers at Cambridge Healthtech’s Bioprocessing Summit, which will be held from August 13–17 in Boston.
