May 1, 2017 (Vol. 37, No. 9)
Angelo DePalma, Ph.D., Writer, GEN
Minireactors, Sampling Techniques, and Statistical Analyses Can Keep Optimization Conversations Going
Crude scaledown techniques have been used since prehistoric times to investigate the inner workings of processes. Therapeutic biotechnology employs more formal platforms for process troubleshooting, modeling, and debottlenecking.
Analysis software is often necessary to model or troubleshoot scaled-down processes. Bioreactor Module, a software package for upstream process development by Genedata, provides a framework for systematic analysis of all bioreactor online and offline data, particularly for critical process parameters measured online and inline, and for offline measurements that require sampling.
“Process development for new molecules utilizes large numbers of offline measurements for initial characterization of in-process samples and end-product samples,” says Christoph Freiberg, Ph.D., senior scientific consultant, Genedata.
For example, for an ambr bioreactor with 15-mL vessel volume, data for two parameters are usually acquired online (pH, dissolved oxygen); data of one or few parameters are captured at-line (viable cell density); and additional process-related data (titer, metabolite concentration) are measured offline.
“Sampling volumes become the limitation for offline data acquisition,” Dr. Freiberg explains. Larger test volumes offer more leeway in terms of sample numbers and volumes.
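To make the volume constraint concrete, here is a hypothetical sampling-budget calculation for a small vessel. The 15 mL working volume echoes the ambr example above; the minimum retained volume and draw size are assumptions for illustration only:

```python
# Hypothetical sampling-budget check for a small-scale bioreactor run.
# All numbers are illustrative, not vendor specifications.

def max_offline_samples(working_volume_ml, min_final_volume_ml,
                        sample_volume_ml):
    """Number of offline samples a vessel can support before the
    remaining volume drops below the minimum needed for control."""
    spare = working_volume_ml - min_final_volume_ml
    return max(0, int(spare // sample_volume_ml))

# A 15 mL vessel that must retain 10 mL, sampled in 0.5 mL draws:
print(max_offline_samples(15.0, 10.0, 0.5))  # 10
```

The same arithmetic explains why larger test volumes "offer more leeway": the spare volume, and hence the sample count, grows with vessel size.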
“Frequently, subsets of samples are frozen for subsequent analysis,” adds Dr. Freiberg. “Their data are then compiled for multivariate statistical analysis and DOE methods, yielding a deeper understanding of process parameters and product quality attributes.”
First, critical process parameters are identified, quality attributes are defined, and a first process is established. Then, additional process analytics are applied to monitor and control the process.
“For process development, scaledown enables easier characterization of upstream processes,” notes Dr. Freiberg. “Statistical analysis of these experiments leads to a thorough definition of the ranges of process parameters, which enables prediction of high drug product quality.”
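As a minimal illustration of the kind of statistical analysis Dr. Freiberg describes, the sketch below estimates main effects from a two-level full-factorial DOE on two hypothetical process parameters; the factor names and response values are invented for the example:

```python
# Minimal sketch: main effects from a two-level full-factorial DOE on
# two process parameters (coded -1/+1). Data are illustrative only.

runs = [  # (pH level, feed level, measured titer in g/L)
    (-1, -1, 2.1), (+1, -1, 2.9),
    (-1, +1, 2.5), (+1, +1, 3.7),
]

def main_effect(runs, factor_index):
    """Average response at the high level minus at the low level."""
    high = [r[-1] for r in runs if r[factor_index] == +1]
    low = [r[-1] for r in runs if r[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

print(main_effect(runs, 0))  # effect of the pH factor
print(main_effect(runs, 1))  # effect of the feed factor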
At present, the most frequently used scaledown bioreactor platform is the ambr platform (Sartorius Stedim Biotech). Other scaledown platforms include MiniBio (Applikon), Micro-24 (Pall Life Sciences), DASbox (Eppendorf), and Multifors 2 Cell (Infors HT) bioreactors. Such platforms typically include reactors with volumes below 250 mL.
Besides bioreactors, classical 48-well deep-well microtiter plates may serve as scaledown instruments. Even though such plates lack the stirring capabilities and online controls found in many scaledown bioreactors, they may be useful as first scaledown screening tools to assess cell-line clones under fed-batch growth conditions. Miniaturized bioreactors based on microfluidics, such as the BioLector (m2p-labs), are not optimal for mammalian cell processes, comments Dr. Freiberg.
Genedata has also launched an update for its mass spectrometry (MS) software platform, Genedata Expressionist. MS is not used extensively for scaledown work—yet—but it provides simultaneous monitoring of multiple product quality attributes, plus metabolite and nutrient data.
Andrea Amantonico, Ph.D., scientific consultant, Genedata, explains why MS still has little uptake in scaledown work: “MS data treatment, analysis, and reporting remain a complex task that requires considerable time investment of highly skilled specialists. Industry has recognized that successful implementation of MS-based analysis in routine bioprocess characterization necessitates the use of one central enterprise software platform for data processing and analysis across R&D and manufacturing.”
The volume issue was likely the justification for Sartorius Stedim Biotech’s introduction of the ambr 250 modular mini bioreactor system for parallel fermentation or cell culture. The 250 uses up to eight single-use bioreactors with working volumes between 100 and 250 mL, with agitation and process control features that have become standard on the ambr platform.
Although the value of single-use is not entirely clear at this scale, the scale is sufficient for multiple samplings for offline analysis in addition to control of pH, dissolved oxygen, temperature, and agitation. Feeds are supplied through syringe pumps.
According to Mwai Ngibuini, formerly the ambr 250 product manager (and currently head of East and West Africa, process solutions, Merck Group), the product scales easily to pilot- and manufacturing-scale bioreactors.
Early in 2017, Charo Scott, Ph.D., integrated development team leader for biologics at Bristol-Myers Squibb, reported that the ambr 250 provided cell growth and productivity metrics that were comparable to those observed by the ambr developers. In addition to modeling large processes at manageable scale, the ambr 250 also served for clone selection, which normally takes place in bench-top bioreactors.
Bayer HealthCare employs scaledown modeling for therapeutic protein process development and troubleshooting across the development lifecycle. “We use chromatography and membrane adsorbers for viral clearance to demonstrate the capabilities of our processes,” says Ashley Hesslein, Ph.D., Bayer HealthCare’s associate director for process development.
Dr. Hesslein differentiates scaledown via traditional small-scale processing from miniaturized, parallel, high-throughput systems on the milliliter scale and even smaller-volume chromatography, including systems based on microplates: “We work with both types to determine what they can tell us about large-scale processes.” The miniaturized scale is used less frequently for debottlenecking of complex or error-prone operations, which at Bayer HealthCare falls under the category of process improvement.
Robotics plays a significant role in these efforts, for example, during chromatography screening or step-optimization experiments in minicolumns or microplates, and occasionally for process characterization and troubleshooting.
Dr. Hesslein notes that chromatography systems are generally compatible with resins and prepacked columns from multiple vendors. Ready-to-run or prepacked columns are essential since errors that arise from the manipulation of very small volumes can be substantial.
Yet even with Bayer HealthCare’s extensive reliance on scaledown, Dr. Hesslein recognizes that miniaturization does not represent large-scale processes perfectly: “There are physical limits to what is possible at very small scale.”
Another issue, she notes, is the paucity of scaledown models for filtration. Bayer HealthCare has been seeking to shift more of the purification burden toward depth filtration, so the company relies on scaledown models for that step. Unfortunately, one vendor discontinued a particular scaledown product, requiring Bayer HealthCare to negotiate a special supply arrangement with the vendor.
Dr. Hesslein also would welcome more rapid, plug-and-play analytics for characterizing the contents of microreactors or microplate wells. “We’re not looking for something as comprehensive as you’d use for release testing,” she remarks, “but analytics that provides an easy read of critical process parameters would be useful.”
Bio-Rad Laboratories recently reported on a mixed-mode chromatography scaledown method for protein purification. At Bio-Rad, a team led by Xuemei He, Ph.D., R&D manager, chromatography media chemistry, used the company’s Nuvia cPrime hydrophobic cation exchange media, which carries three functionalities: a weak carboxylic acid end group, a hydrophobic aromatic ring, and an amide bond serving as a potential hydrogen bond donor/acceptor. Together, these functionalities make it possible, says Bio-Rad, for the media’s users to purify proteins with “unique selectivity and good conductivity tolerance … under gentle conditions.”
The test system consisted of 100 μL of Nuvia cPrime resin packed in Mini Bio-Spin columns, with sodium chloride binding and eluting buffers at pH 4 to 8. For this experiment, Bio-Rad used common model proteins, such as bovine serum albumin, bovine carbonic anhydrase, and conalbumin.
“We wanted to show that, regardless of the isoelectric point or molecular mass, and with limited knowledge of the protein’s structural complexity, the interaction of the protein with a mixed-mode chromatography media within specified pH and conductivity ranges could be revealed through quick screening,” explains Dr. He. “The same strategy can be applied to therapeutic proteins.”
For analyzing a limited number of replicates, Dr. He used statistical software from JMP, which sells packages for genomics and clinical applications as well as predictive modeling software.
Dr. He notes that the design-of-experiment (DOE) approach works with mixed-mode chromatography with “a diverse set of proteins,” providing, with just 11 experiments, conditions for optimal purity and recovery. Finally, Dr. He concludes with an even more germane point: “Following the operational conditions suggested by DOE, we were able to scale up the purification of lysozyme from crude expression harvest with remarkable purity.”
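The article does not specify which 11-run design Bio-Rad used, but one plausible construction for two factors (say, pH and salt concentration) is a face-centered central composite design with three replicated center points, which yields exactly 11 runs:

```python
# Sketch: a two-factor face-centered central composite design (CCD).
# 4 factorial corners + 4 axial points + 3 center replicates = 11 runs.
# This is one plausible way to arrive at "just 11 experiments"; the
# article does not confirm which design was actually used.
from itertools import product

def face_centered_ccd(n_center=3):
    factorial = list(product([-1, +1], repeat=2))  # 4 corner runs
    axial = [(-1, 0), (+1, 0), (0, -1), (0, +1)]   # 4 face-centered axial runs
    center = [(0, 0)] * n_center                   # replicated center points
    return factorial + axial + center

design = face_centered_ccd()
print(len(design))  # 11
```

Each coded point (-1, 0, +1) would then be mapped to actual pH and conductivity values within the screening ranges before running the columns.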
A New View
The various platforms for upstream and downstream scaledown modeling call for some sort of framework for their utilization. It’s convenient to delineate upstream and downstream operations, or to add granularity by imposing the boundaries of individual unit operations. Surendra Balekai, senior global product manager for single-use technology at Thermo Fisher Scientific, offers an alternative view based on the components of a bioprocessing effort, namely process equipment, automation and control, and the process itself. This approach seeks a systematic examination of the vulnerable points within the process lifecycle that transcends individual steps.
At the forefront of potential equipment issues is the bioreactor itself. “Is the design appropriate?” Balekai asks. “Is the height-to-diameter ratio appropriate for the process, the sparging, impeller type, and positioning? Is the jacket for heating and cooling efficient enough for this scale? Does the impeller motor provide a sufficient power-to-volume ratio?” That some of these parameters replicate poorly at small scale does not mean they should be disregarded.
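The power-to-volume ratio Balekai mentions can be sketched with the standard ungassed stirred-tank relation P = Np·ρ·N³·D⁵. The numbers below are illustrative, not specifications for any real vessel:

```python
# Sketch: impeller power per volume, a common scale-up/scale-down
# criterion for stirred-tank bioreactors. Values are illustrative.

def power_per_volume(power_number, rho_kg_m3, rpm, impeller_d_m, volume_m3):
    """Ungassed power draw per volume (W/m^3): P = Np * rho * N^3 * D^5."""
    n = rpm / 60.0                                       # rev/s
    power_w = power_number * rho_kg_m3 * n**3 * impeller_d_m**5
    return power_w / volume_m3

# Hypothetical 250 mL vessel: Np = 1.5, water-like broth, 600 rpm,
# 3 cm impeller. Matching this P/V is one way to relate scales.
print(round(power_per_volume(1.5, 1000.0, 600.0, 0.03, 0.00025), 1))  # 145.8
```

Holding P/V constant across scales is only one criterion; tip speed or kLa matching may be chosen instead, which is part of why small-scale replication is imperfect.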
Balekai has always appreciated the value of automation and control, an interest that recently deepened with Thermo Fisher’s acquisition of Finesse Solutions in February 2017. A Thermo Fisher technology partner since 2013, Finesse developed its universal control systems initially for laboratory-scale flexibility through large-scale single-use production, and these systems currently serve Thermo Fisher’s scaledown interests as well.
In December 2016, Finesse launched its second-generation G3Lite+ SmartController for single-use bioreactors, extending the capabilities of that product line to all major single-use bioreactor brands from 50 L to 2,000 L in operating volume, and to single-use fermenters up to 300 L. The G3Lite+ controller accommodates up to four peristaltic SmartPumps locally, or seven pumps when connected to an external pump tower. It also incorporates Finesse’s SmartTransmitters for TruFluor single-use sensors or electrochemical sensors.
Investigations of instrumentation and automation lead to further questions about their type and suitability: Does the bioreactor use electrochemical or single-use sensors, and are they user-calibrated before deployment? Do the sensors provide reliable performance, or do they drift, requiring daily one-point calibration?
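The daily one-point calibration mentioned above can be sketched as a simple offset correction against an offline reference sample; the class and method names here are hypothetical, not any vendor's API:

```python
# Sketch of a daily one-point recalibration for a drifting sensor:
# shift readings so the sensor agrees with an offline reference sample.
# Class and method names are illustrative.

class DriftCorrectedSensor:
    def __init__(self):
        self.offset = 0.0

    def one_point_calibrate(self, raw_reading, reference_value):
        """Store the offset between the sensor and an offline reference."""
        self.offset = reference_value - raw_reading

    def corrected(self, raw_reading):
        """Apply the stored offset to a raw reading."""
        return raw_reading + self.offset

sensor = DriftCorrectedSensor()
# The probe reads pH 7.25, but an offline sample measures 7.10:
sensor.one_point_calibrate(raw_reading=7.25, reference_value=7.10)
print(round(sensor.corrected(7.25), 2))  # 7.1
```

A one-point correction fixes offset drift only; slope (span) drift requires a two-point calibration, which is harder to perform on a sealed single-use vessel.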
“Even more significant is whether the controller properly acquires sensor data, which is necessary for providing responses appropriate for optimal control,” Balekai points out. For example, the sensor may report a pH of 7.1 correctly, but the display may show some other value. Another source of corrupted sensor data is the starting and stopping of the impeller motor.
If operators have come this far without noting any particular issues, it is time to consider process parameters that are, at least theoretically, the most amenable to scaledown-type troubleshooting. Yet these parameters also play into instrumentation and control.
Dealing with metabolic gases, technically a process issue, is complicated by its dependence on equipment and control. Carbon dioxide accumulation, for example, can be mitigated either through stripping (instrumentation plus control) or through alkali addition (control plus process). Feed issues are likewise a process matter, but they rely on precise measurement of feed component concentrations and consumption over time.
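The stripping-versus-alkali choice can be sketched as a simple decision rule; the thresholds and action names below are invented for illustration and are not a validated control strategy:

```python
# Sketch: choosing between CO2 stripping (increase sparge) and alkali
# addition when dissolved CO2 accumulates and pH falls. Thresholds,
# units, and action names are illustrative assumptions.

def co2_control_action(dco2_mmHg, ph, dco2_limit=100.0,
                       ph_setpoint=7.0, ph_deadband=0.05):
    if dco2_mmHg > dco2_limit:
        return "increase_sparge"   # strip CO2: instrumentation plus control
    if ph < ph_setpoint - ph_deadband:
        return "add_alkali"        # neutralize acid: control plus process
    return "hold"                  # within limits; no action

# High dissolved CO2 takes priority over the pH deviation:
print(co2_control_action(120.0, 6.9))  # increase_sparge
```

The ordering matters: stripping removes the root cause (dissolved CO2), whereas alkali addition masks it and raises osmolality, so a real controller would weigh both.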
“Process consistency is a big issue, particularly during the early stages of cell culture, when cells are doubling,” Balekai notes. Cell proliferation, a parameter that is relatively easy to monitor, should be smooth and predictable at this point. Additional concerns involve the use of oxygen or oxygenated air for metabolism control, and replenishment of nutrients during fed-batch processes.
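The smooth, predictable proliferation Balekai describes is typically tracked as a specific growth rate fitted to viable-cell-density readings; a minimal sketch, with made-up numbers, assuming exponential growth between two measurements:

```python
import math

# Sketch: specific growth rate and doubling time from two viable-cell-
# density (VCD) readings, assuming exponential growth between them.
# The example values are illustrative.

def specific_growth_rate(vcd_start, vcd_end, hours):
    """Specific growth rate mu (1/h): mu = ln(VCD_end / VCD_start) / t."""
    return math.log(vcd_end / vcd_start) / hours

def doubling_time(mu_per_h):
    """Population doubling time (h) from the specific growth rate."""
    return math.log(2) / mu_per_h

# 1.0e6 -> 2.0e6 cells/mL over 24 h implies a 24 h doubling time:
mu = specific_growth_rate(1.0e6, 2.0e6, 24.0)
print(round(doubling_time(mu), 1))  # 24.0
```

A day-to-day dip in mu during the doubling phase is exactly the kind of inconsistency that sends a process back to the scaledown model.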
As noted, process-related parameters, particularly those related to cell culture, are the most critical and, not surprisingly, the best covered by scaledown modeling. Scaledown troubleshooting is often employed when data from consecutive days or batches do not match. It works both ways, Balekai says. “You can get good efficiencies at small scale and even better performance at large scale. Or it can be worse, or different. Where results are not what you expect, you must return to the scaledown model to re-establish the process.”