Feature Articles : Jun 1, 2013
Strategies to Optimize Cell Culture
With the worldwide appetite for biologics soaring—the U.S. alone accounts for roughly $100 billion annually, and that’s growing at 11% CAGR—efforts to optimize bioprocessing technology remain vigorous and varied.
Faster assays for cell-line screening, small-scale modeling to predict large-scale production effects, and new microfluidics able to finely control microenvironments are just a few of the many technologies on the horizon.
One of the most pressing concerns for all companies manufacturing biologics is preventing viral contamination, and raw materials account for a considerable portion of that risk. Companies often switch raw materials to mitigate it, particularly by moving away from materials of animal origin. The problem, however, is the difficulty of predicting how these changes will affect legacy processes. Sofie Goetschalckx, Genzyme’s manufacturing cell culture science lead, technology division, discusses her firm’s practices.
“Reducing viral risk is our primary goal in changing materials at the moment, although sometimes we assess a second supplier so if one company goes out of business or can no longer provide the product, we will still have a supplier who can produce the material,” says Goetschalckx.
“We’ve developed qualified small-scale models. So we have a 10-liter cell culture model that performs similarly to our ‘at-scale’ 4,000-liter bioreactor, and we have a downstream model whose performance matches the overall product quality generated at scale,” she adds. “We take three different lots from the supplier so we have some variability in the raw material, run the small-scale models, and see what the impact is.”
Building these small-scale models takes two to three years, according to Goetschalckx. One of the most problematic issues is controlling pCO2. “At small scale, pCO2 is completely different from at scale and has a huge impact on cell growth and recovered product quality. Our cells tend to grow a little better at scale than at small scale.” Genzyme adds more CO2 to the small-scale process to compensate.
Sampling volume also has an effect: each sample removes a proportionally larger share of the culture at small scale, so its impact is much more significant than at scale. “It turns out there are some differences that are challenging; some you can solve and some you cannot,” she notes. Depending on the results and the risk assessment of the material’s criticality, “we determine if we need additional data at scale in a kind of engineering run. Ultimately, this package of data will be reviewed by laboratory quality and also regulatory authorities to assess if you need to resubmit. Resubmission is very rare.”
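The lot-screening exercise Goetschalckx describes can be sketched as a simple comparison of small-scale results against the at-scale historical range. The sketch below is a minimal, hypothetical illustration; all titer values and the three-sigma acceptance band are invented for the example, not Genzyme’s actual criteria.

```python
# Hypothetical sketch of the lot-screening logic: run each raw-material
# lot in the qualified small-scale model and flag lots whose mean
# response falls outside the at-scale historical range (here, the
# historical mean +/- 3 standard deviations).
from statistics import mean, stdev

# At-scale historical titers (g/L) -- illustrative values only.
at_scale_history = [2.10, 2.25, 2.05, 2.18, 2.30, 2.12, 2.22]
hist_mean = mean(at_scale_history)
hist_sd = stdev(at_scale_history)
low, high = hist_mean - 3 * hist_sd, hist_mean + 3 * hist_sd

# Small-scale (10 L model) titers for three supplier lots.
lots = {
    "lot_A": [2.15, 2.20, 2.08],
    "lot_B": [2.02, 2.11, 2.06],
    "lot_C": [1.55, 1.62, 1.58],   # e.g., a lot with a real impact
}

for name, titers in lots.items():
    m = mean(titers)
    verdict = "comparable" if low <= m <= high else "flag for at-scale run"
    print(f"{name}: mean {m:.2f} g/L -> {verdict}")
```

A lot that falls outside the band would then trigger the additional at-scale engineering run described above.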
Sialic Acid Content Analysis
Sialic acid (SA) is an important component of therapeutic proteins; it can prolong serum half-life, influence biological activity, and improve solubility. Lam Raga Anggara Markley, Ph.D., scientist, Biogen Idec, discusses an improved high-throughput assay for SA content, the high-throughput total sialic acid assay (HT-TSA), developed at Biogen and based on his earlier work at MIT. One important application is faster screening of clones when thousands must be analyzed to winnow down candidates for a particular bioprocess.
Several factors affect SA content, among them intraclonal variability, production parameters, and instability. Currently, HPLC is most often used to measure SA content, but the process can take five days.
The new high-throughput method (HTM) is more accurate than the earlier version and “takes about 70 minutes, and we can analyze more than 100 samples in one day,” says Dr. Markley. Speed is the main advantage. “The downside is that HTM cannot distinguish Neu5Ac (N-Acetylneuraminic acid; NANA) from Neu5Gc (N-Glycolylneuraminic acid; NGNA) because they have the same fluorescence spectra. HPLC can distinguish the two.”
The assay method consists of four steps.
“People may wonder when they should use HPLC and when to use the HTM assay. As a general rule, use HPLC when you don’t have a large number of samples and you need to know the level of the two acids. If you have hundreds of samples and need the data quickly, and don’t need to know both levels, you can use the high-throughput method,” Dr. Markley explains.
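Dr. Markley’s rule of thumb lends itself to a small decision helper. The sketch below is a hypothetical illustration (the function name and the numeric threshold for “hundreds of samples” are assumptions, not part of the Biogen workflow):

```python
# Hypothetical helper encoding the rule of thumb for choosing between
# HPLC and the high-throughput (HTM) assay described above.
def choose_sa_assay(n_samples: int, need_neu5ac_vs_neu5gc: bool) -> str:
    """Return the suggested sialic acid assay.

    HPLC distinguishes Neu5Ac from Neu5Gc but can take five days;
    HTM reports total SA only, runs in ~70 minutes, and handles
    100+ samples per day.
    """
    if need_neu5ac_vs_neu5gc:
        return "HPLC"          # only HPLC separates the two acids
    if n_samples >= 100:       # illustrative threshold for "hundreds"
        return "HTM"
    return "HPLC"

print(choose_sa_assay(500, need_neu5ac_vs_neu5gc=False))  # HTM
print(choose_sa_assay(12, need_neu5ac_vs_neu5gc=True))    # HPLC
```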
Microfluidic Cell Culture
Microfluidic cell culture offers many distinct advantages, not least that scientists can work with expensive, rare cells, use less reagent, and in many cases control the environment more accurately than in bulk cell culture, petri dishes, or multiwell plates. Samuel Forry, Ph.D., research chemist, National Institute of Standards and Technology (NIST), talks about technology developed at his institution.
“At NIST we’re very interested in fundamental measurements and the way measurement can improve biology. I think we have demonstrated a level of control and ability to measure the microenvironment around cells, which is revealed in sensitivities to partial gas pressures that may not have been appreciated,” notes Dr. Forry.
To accomplish this, Dr. Forry and his colleagues have fabricated the microchamber and microfluidic compartments out of a gas-permeable material, poly(dimethylsiloxane) (PDMS), and routed them near each other. “We can then allow diffusion through the material to give us control over the partial pressure in the culture chambers where the cells are being cultured. We’ve shown this for oxygen and for CO2.”
Essentially, they create stagnant conditions but are able to supply sufficient gases to keep the cells from disrupting the homeostasis in the environment. “We perfuse media past the cells but do it only intermittently,” he notes. “It turns out conventional cell culture media has enough salts, amino acids, and sugars to last for a pretty long time, but the gas partial pressures get out of whack quickly. We control the gas partial pressures directly through the material.”
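A back-of-envelope calculation shows why gases, not nutrients, are the limiting factor in a sealed chamber. All values below are illustrative, order-of-magnitude assumptions, not NIST measurements:

```python
# Order-of-magnitude sketch of why dissolved gases "get out of whack"
# long before nutrients run out in a sealed microfluidic chamber with
# no gas exchange. All parameter values are illustrative.
cell_density = 1e11        # cells per liter (~1e5 cells in 1 uL)
o2_conc = 2e-4             # mol/L dissolved O2 in air-saturated media
glucose_conc = 2.5e-2      # mol/L (~25 mM, typical of rich media)
o2_uptake = 1e-16          # mol per cell per second (typical order)
glucose_uptake = 1e-16     # mol per cell per second (typical order)

t_o2 = o2_conc / (cell_density * o2_uptake)                # seconds
t_glucose = glucose_conc / (cell_density * glucose_uptake)  # seconds

print(f"O2 lasts ~{t_o2:.0f} s; glucose lasts ~{t_glucose / 60:.0f} min")
```

With these assumptions, dissolved oxygen is exhausted in tens of seconds while glucose lasts for well over half an hour, which is consistent with supplying gases continuously through the PDMS while perfusing media only intermittently.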
The most common use, according to Dr. Forry, “is where you want to create a hypoxic environment or maintain 5% CO2. We can also create gradients across a microfluidic chamber to create systems where one side of our chamber has normal oxygen level or maybe 21% and the other side has hypoxic conditions and one can look at the way cells respond in the two different environments side by side.”
Rethinking Shear Sensitivity
Shear sensitivity is an often-cited cause of bioreactor failure. Jeffrey J. Chalmers, Ph.D., professor, chemical & biomolecular engineering, Ohio State University, questions the validity of this concern.
“Shear sensitivity is misused. It’s a bad term to start with. It’s used as an excuse when people don’t know why things aren’t working well,” says Dr. Chalmers, noting that the idea that cells could not be grown in suspension was held with equal faith 25 years ago and proved dreadfully wrong. For example, “When no surfactant was used to prevent cell adhesion to interfaces, people used to blame that kind of cell death on mixing when it was due to other interactions. If the cells are in suspension, not attached to a microcarrier, they are pretty tough,” he insists.
No doubt it’s possible to have clones that are more susceptible to hydrodynamic forces, he says, “but that’s not usually observed in the bioreactor because that’s a pretty gentle area. Downstream processing is where people can see problems.” Dr. Chalmers’ group has developed a device to help companies test clones to determine how sensitive they may be to specific hydrodynamic forces.
Matrix-Free 3D Spheroid Technology
Eight to ten years ago, “we said we’ve got video cameras, we’ve got money, whole animal research will come to dominate,” says Mark DeCoster, Ph.D., associate professor, biomedical engineering, Louisiana Tech University (LTU). Since then budgetary constraints have changed the rosy picture. “It’s forced us to become more realistic. So while 2D cell culture won’t go away anytime soon, an intermediate 3D option is needed.”
“We have established a novel matrix-free 3D cell spheroid system that permits growth and maintenance of normal cells, stem cells, and cancer cells,” continues Dr. DeCoster. “In addition to processing of soluble drugs, we are also using our 3D system to evaluate bioprocessing of micro- and nanomaterials.
“We have measured binding and internalization of these materials as well as toxicity of nanomaterials. It is anticipated that 3D systems will provide new information for materials bioprocessing compared to traditional 2D cell culture systems due to differences in diffusion and cell-cell communication.”
His LTU lab concentrates on neuroscience. “Many brain tumors have regions like this: a core that has died while the outside is growing. What’s quite satisfying to us and others is that spheroid models recapitulate that; after they reach a certain size, the limits on how far glucose and oxygen can penetrate cause a necrotic core, yet the spheroid still grows because there is sufficient oxygen and glucose on the outside. Our system is excellent for observing this kind of activity.”
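The necrotic-core onset Dr. DeCoster describes follows from a standard reaction-diffusion argument: for a sphere consuming oxygen at a uniform volumetric rate Q, with diffusivity D and surface concentration C0, the center becomes anoxic once the radius exceeds sqrt(6·D·C0/Q). The sketch below uses illustrative, typical-order parameter values, not measurements from the LTU system:

```python
# Back-of-envelope estimate of the spheroid radius at which oxygen can
# no longer reach the center, producing a necrotic core. Steady-state
# diffusion with uniform consumption in a sphere gives
#   C(r) = C0 - Q * (R**2 - r**2) / (6 * D),
# so C(0) = 0 when R = sqrt(6 * D * C0 / Q). Parameters are illustrative.
from math import sqrt

D = 2e-9      # m^2/s, O2 diffusion coefficient in tissue (typical order)
C0 = 0.2      # mol/m^3, O2 concentration at the spheroid surface
Q = 0.02      # mol/(m^3 s), volumetric O2 consumption rate

r_crit = sqrt(6 * D * C0 / Q)          # meters
print(f"critical radius ~{r_crit * 1e6:.0f} um")
```

With these numbers the critical radius comes out at a few hundred micrometers, broadly consistent with the spheroid sizes at which necrotic cores are commonly reported.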
He founded a startup, Nanogaia, to commercialize the technology. An important remaining challenge, he notes, is developing the needed informatics: “Storage has become incredibly cheap; the question is algorithms and software to turn the pretty pictures into meaningful things.”
© 2016 Genetic Engineering & Biotechnology News, All Rights Reserved