November 1, 2014 (Vol. 34, No. 19)
Angelo DePalma, Ph.D., Writer, GEN
Biopharmaceutical production technologies, operations, and quality systems are undergoing a paradigm shift according to William Whitford, senior manager of HyClone cell culture at GE Healthcare Life Sciences.
“The drivers include the expiry of biologic patents, increased demand for personalized medicine, and greater global competition.”
Most experts now credit cell culture optimization as the leading factor in rising titers and quality/efficiency improvements during the last 10-plus years. However, new processes still require extensive development work, and processors constantly look for tweaks in existing processes.
Suitable preformulation media may not exist for emerging production platforms, including new avian and human cell lines and multiple recombinant null clones of various origins, according to Whitford. “New, innovative process strategies that can influence media composition are becoming more prevalent even though fed-batch will remain the industry workhorse.”
Some of these strategies include continuous biomanufacturing and transient transfection. Production modes (such as intensified perfusion and fed-batch) and even specialized equipment (such as perfusion retrofits, packed-bed reactors, and single-use systems) impose their own demands.
In addition, new product categories such as bispecific mAbs, F(ab)s, enzyme replacement therapies, coagulation factors, and bioconjugates are driving changes in media design. Other new biologics demanding specific media optimization include the emerging cell-based auto/allo/xeno therapies, bioartificial tissues or organs, and animal cell-derived vaccines.
The modern business imperatives of outsourcing, globalization, and local national supply (as well as their implications for novel vaccines and biosimilars) require fresh consideration of culture media economics. Biomanufacturers also feel pressure because of rising standards for raw materials, product purity, packaging, and cold chain performance. To meet expectations, bioprocessors are rethinking their approaches to analytic testing, instrumentation, and formal risk analysis.
Each of the above, notes Whitford, places novel demands upon production media and feeds, and he adds that cell biology and analytics are critical for supporting the development of appropriate media formulations. “A clear culture media development strategy [is needed],” he insists, whether it is “executed either in-house or in close collaboration with an expert cell culture media provider.”
The Material Is the Message
How materials science affects cell cultures remains an issue 15 years after the debut of single-use bioprocessing. All vendors today provide a dossier on plastic process containers, including detailed information on materials of construction—particularly for product- or critical fluid-contact surfaces. With the release of every new contact surface resin, however, concerns arise over leachables that might reduce cell growth, yields, or even drug safety.
Researchers from Sartorius Stedim Biotech recently published a paper focusing on potentially cytotoxic leachables from a newly introduced polyethylene multilayer film, S80. The S80 resin was developed for a range of process applications including storage, mixing, and bioreactors. Sartorius Stedim Biotech employs optimized S80 in its FlexSafe® line of single-use products, and Amgen is a customer.
The concern over potential toxicity arises from the manufacturing process involved in producing the disposable containers. “For plastics to withstand process steps such as extrusion, welding, and gamma irradiation and still retain desirable properties requires addition of an additive package,” says Magali Barbaroux, Ph.D., director of R&D for film and materials at Sartorius Stedim Biotech.
Single-use suppliers must select additives that first optimize properties of the bag or container, then work within variations on that additive package to minimize or eliminate harm to the cell culture. “It becomes a balancing act between performance as a film and performance as a pharmaceutical container,” explains Dr. Barbaroux.
After using a common cell growth assay and chemical analysis to test for leachables from prototype films, Sartorius Stedim Biotech concluded that S80-based cell culture bags did not pose a threat to cell cultures. Yet the company’s exercise underscores the increasingly eclectic sets of expertise required to introduce a new product into this competitive market.
Role of Analytics
Since biomanufacturers began chanting the mantra of process analytics, the goal has been analyses that are faster, more accurate, indicative of key process parameters, and as close to real time as feasible. Several assay platforms have emerged to complement efforts at inline or at-line instrumental analysis.
One of the more versatile systems is PerkinElmer’s LabChip GXII Touch. It is based on traditional gel electrophoresis, but it has a microfluidic chip-based format. By scaling down sample volumes to picoliters, LabChip GXII reduces separation channel length and thereby analysis time. Denatured samples load automatically from a microtiter plate, mix with a normalization marker in the channels, and undergo staining and destaining on the chip.
“Although the platform supports both nucleic acid and protein assays, most researchers focus on one or the other,” observes Richard P. Bunch, director of the microfluidics product portfolio at PerkinElmer. Process developers optimizing cell culture conditions can employ a suite of protein characterization assays designed for the instrument, including one for glycan profiling.
According to Bunch, LabChip improves on conventional SDS-PAGE gel analysis with higher throughput, greater automation, direct-from-microplate processing, and less waste. Depending on the culture and the assay kit, LabChip provides insights into protein purity, fragmentation/size, N-glycan profiles, recovery/yield, relative concentration, titer, and charge variants.
“LabChip GXII is applicable across the entire biotherapeutic workflow,” Bunch continues, including analytical development and quality control. “One benefit of serving multiple areas along this workflow is continuity of methods and the transfer efficiency of a molecule across developmental boundaries as it matures to a commercial product.”
For example, the same platform type becomes available for optimizing cell culture conditions for protein yield, stability, and purity. And its rapid turnaround allows more complex experimental matrices during cell culture optimization when mapping the variable space. “Then,” notes Bunch, “the same platform can be used to also optimize purification protocols and formulations.”
The Data Question
The adoption of inline or at-line analytics during cell culture development and beyond generates quantities of complex data that were unimaginable a decade ago. Electronic laboratory notebooks (ELNs) and laboratory information management systems (LIMSs) have not completely solved the data problem. Traditionally, neither system has been capable of both capturing and analyzing critical analytic and process data.
Jarrod Medeiros, senior consultant at data analysis firm IDBS and a former upstream processing engineer at a major pharmaceutical company, has watched data proliferate throughout his career. Thanks to microbioreactors, development groups run more cultures than ever, often in parallel, while a growing appreciation for scaledown studies multiplies the angles from which each run is analyzed.
“More runs generate more data, for example, from metabolomics and glycoprofiling,” Medeiros notes. “There is a great need for analysis tools that bring this data together meaningfully, to provide users with access and the ability to compare and link that data.”
Medeiros is fond of the term “genealogy” to describe the stage of a development project from which the data arises, and how this data links to the critical quality attributes that constitute a glycoprofile. His company’s approach also incorporates historical data from disparate runs across the entire development process.
“You might notice that the pH had been a bit high in one run, and that condition may have affected the glycoprofile,” Medeiros explains. “Traditionally, someone would use a spreadsheet and paper and manually input bioreactor data. Half the time, results would be emailed, which required cutting and pasting.”
Newer approaches, however, are more convenient. Most instruments today integrate directly into data systems that store data points throughout a development project. This allows pinpointing in time and drilling down into instances that represent positive or negative excursions from optimal performance.
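The kind of run-to-quality linkage Medeiros describes can be sketched in a few lines of code. This is a minimal illustration, not IDBS’s implementation; the run names, fields, and pH control limit below are all hypothetical.

```python
# Sketch: joining bioreactor run data to glycoprofile results and
# flagging pH excursions. All names and thresholds are illustrative.

runs = [
    {"run": "R1", "ph_max": 7.05, "titer_g_per_l": 3.1},
    {"run": "R2", "ph_max": 7.35, "titer_g_per_l": 2.4},  # pH ran high
]
glyco = {
    "R1": {"G0F": 0.62, "G1F": 0.28},
    "R2": {"G0F": 0.74, "G1F": 0.18},
}

PH_UPPER = 7.2  # hypothetical upper control limit

def flag_excursions(runs, glyco, ph_upper=PH_UPPER):
    """Join process data with glycoprofiles and flag high-pH runs."""
    report = []
    for r in runs:
        entry = dict(r)                      # copy process parameters
        entry.update(glyco.get(r["run"], {}))  # attach quality attributes
        entry["ph_excursion"] = r["ph_max"] > ph_upper
        report.append(entry)
    return report

report = flag_excursions(runs, glyco)
for row in report:
    print(row["run"], "excursion" if row["ph_excursion"] else "ok")
```

Once process and quality data live in one joined structure, a high-pH run whose glycoprofile shifted can be spotted programmatically instead of by cutting and pasting between spreadsheets and emails.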
Medeiros describes the IDBS product, E-WorkBook Suite, as a hybrid software package: an ELN with some LIMS-like functionality that operates like a process execution system. One customer, Lonza Biologics, has noted that unlike LIMS that are rigidly designed for specific workflows, E-WorkBook Suite is flexible and more suitable for biologics development.
Optimizing from the Ground Up
The Apollo™ cell culture platform from Fujifilm Diosynth Biotechnologies represents one of the more ambitious attempts to redefine cell culture from the bottom up. Apollo comprises a cell line, expression vector, cell-line development process, and basal and feed media suitable for single-use and fixed-tank processes, and includes novel and proprietary components.
Fujifilm’s revamping of all suboperations begins with the host cell line and vector. “If you want a robust, high-quality, highly producing biomanufacturing process, your starting point must be right,” says Alison Porter, Ph.D., head of mammalian cell culture R&D at the company.
The host cell line, produced through directed evolution rather than direct engineering, exploits characteristics inherent in the original cells. Through this process, developers apply environmental stimuli to select subpopulations. Specifically, developers look for a host cell line with reduced doubling time, an ability to grow to high viable cell densities, and a capability to achieve high expression levels.
The first run-through with the eventual new host cell line and vector produced therapeutic proteins at 3 g/L unoptimized. “This is about what most companies moving toward Phase I look for as a good starting point,” Dr. Porter notes. “Productivity can improve even more with additional development.” The new DHFR selectable-marker vector incorporates a proprietary leader sequence to allow efficient expression of a recombinant protein.
Fujifilm paid close attention to cloning, which has become a point of concern for regulators who insist on monoclonality. “You don’t want to get into late phase with a product and have regulators questioning clonality of the recombinant cell line,” Dr. Porter explains. “We have experimental and statistical data to support our cloning methods and how we calculate the probability of monoclonality.”
On the media/feed side, the company focused on basal media and nutrient feed media that work with normal glucose or glutamine supplementation, which brings expression up to the 4–5 g/L range while maintaining high cell density and product quality.
In redesigning large-scale cell culture, Fujifilm takes a thorough but practical approach to process development. During early-phase studies, developers focus on parameters that matter and are difficult to control.
“We don’t need to study mixing or sparging speed, [which are] components of our scaleup-scaledown platform that we’re already quite familiar with,” says Stewart McNaull, director of development and technical services at Fujifilm. Instead, he focuses on how much and when to feed, pH schemes, and other less straightforward process parameters.
At later stages, more comprehensive design-of-experiment methodologies—encompassing microbioreactors (particularly the ambr™ systems from TAP), small bioreactors, and the usual statistical and process control software—are applied to achieve a full process understanding. These are coupled with robust scaledown models that produce data that are truly representative of manufacturing scale.
“Optimization should be phase-appropriate,” McNaull adds. Thus far the company has applied these late-stage, quality-by-design methodologies to 24 programs.
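The design-of-experiment approach described above starts from an experimental matrix spanning the chosen factor levels. The sketch below generates a simple full-factorial design; the factor names and levels are illustrative assumptions, not Fujifilm’s actual parameters, and real DOE work would typically use fractional or optimal designs via statistical software.

```python
# Sketch: full-factorial DOE matrix for cell culture parameters.
# Factor names and levels are hypothetical examples.
from itertools import product

factors = {
    "feed_rate_pct": [2, 4, 6],      # feed as % of working volume per day
    "ph_setpoint": [6.9, 7.0, 7.1],
    "temp_shift_c": [35.0, 36.5],    # post-growth-phase temperature
}

def full_factorial(factors):
    """Return every combination of factor levels as a list of run dicts."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

design = full_factorial(factors)
print(len(design))  # 3 * 3 * 2 = 18 runs
```

A matrix like this maps directly onto parallel microbioreactor systems such as ambr™, where each dict becomes the setpoints for one vessel.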
Modern cell biology research dates back to the early 1950s, when it first became possible to maintain, grow, and manipulate cells outside of a living organism. During the same time period, a scientist named Ernest Polge discovered that living cells and tissues could be preserved through the use of very low temperatures, and the field of cryopreservation was born. Today, the art of preserving and manipulating cells for therapeutic purposes is one of the fastest growing areas of modern science.
Cells are fragile and can be sensitive to even minute changes in temperature. Most of the temperature variability that plagues cell-based research arises during sample handling and results in low cell survival and viability, lack of reproducibility, and even apoptosis. In the field of cell-based therapy, lack of reproducibility is arguably an even more serious problem, since it translates directly into changes in efficacy and dosage of the drug product.
Cell-based research is also hampered by loss of sample integrity, which often occurs during the cryopreservation process. Cryopreservation protocols vary with the cell type being preserved, but in general the process rests on the precept that the ultimate survival of a frozen and thawed cell depends on the cooling and thawing rates. Standardized methods that freeze cells at the optimal rate of −1°C/min, preferably without alcohol-based freezing containers or other sources of variability, and that thaw cells at higher temperatures (and therefore faster thawing rates) have been shown to yield higher post-thaw cell viability.
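The practical consequence of a controlled −1°C/min rate is easy to work out. This is a simple back-of-the-envelope sketch assuming a linear cooling profile; the start and target temperatures are illustrative, not part of any cited protocol.

```python
# Sketch: time required to cool a sample at the cited rate of -1 deg C/min,
# assuming an idealized linear cooling profile.

def cooling_time_min(start_c, target_c, rate_c_per_min=1.0):
    """Minutes to cool linearly from start_c down to target_c."""
    if target_c >= start_c:
        raise ValueError("target must be below start temperature")
    return (start_c - target_c) / rate_c_per_min

# e.g. from 20 deg C at the bench down to -80 deg C at -1 deg C/min:
print(cooling_time_min(20, -80))  # 100.0 minutes
```

The point of the arithmetic is that controlled-rate freezing is slow by design, which is why shortcuts during handling and transfer are a common source of the variability described above.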
Temperature control during collection, isolation, and preparation of cells is crucial, but even when those dynamics are well controlled, cells are exposed to other temperature-related hazards once they are ready to be transported and stored. Shipping is typically a weak point, since packages are often under the care of a contract carrier with little in-depth knowledge of biological materials and without the required temperature monitoring. Similarly, when samples are transferred from large freezers to automated processing systems for long-term storage, temperature fluctuations must be monitored to avoid transient warming events.
While there is still a long way to go, efforts to improve temperature standardization and reproducibility are gaining momentum. Fluctuations in temperature are being remedied by innovative temperature control technology, improved oversight, and standard operating procedures (SOPs) with stricter temperature requirements. Continued increased awareness of the unique importance of temperature management to the field of cell biology will bring a brighter future to both research and patient care.