May 15, 2015 (Vol. 35, No. 10)
Upstream’s Relatively Continuous Flows Raise Hopes for Continuous Purification
Perfusion culture has been employed for at least 20 years in bioprocessing, but it is now catching on as the upstream continuous unit operation.
Repligen was betting on this when it acquired ATF (alternating tangential flow) technology from Refine Technology in June 2014. Repligen’s ATF systems director John Bonham-Carter was a principal at Refine and worked with ATF’s inventor, Jerry Shevitz, Ph.D.
Important factors slowing adoption were perfusion’s complexity, unreliability, and unsuccessful early deployments at manufacturing scale. As a result, no equipment vendor could gain a foothold in the market. That has changed with the introduction of improved cell retention devices and bioreactors with high oxygen transfer capabilities.
The ATF replaces the centrifuge and depth filter with a single, continuously operating device that simplifies scaleup and technology transfer, producing a cell-free harvest stream. ATF enables cell densities of over 80 million cells per milliliter—concentrations that are impossible to achieve through batch culture or with previous-generation perfusion devices.
Bonham-Carter believes that single-use processing equipment paved the way, philosophically speaking, for continuous processing: “It has made people who were averse to new technologies more confident in adopting them. Replacing mixing tanks and reactors with bags is no longer considered radical, but it required a mindset change.”
He notes that just eight years ago, no one was interested in perfusion culture for large-scale manufacturing. Today, every bioprocessor must at least consider perfusion, if only for new processes and facilities.
Perfusion is arguably the leading technology for upstream processing. “It’s not just continuous, it’s intensified by virtue of cell density,” continues Bonham-Carter. “It’s simple because you’ve eliminated equipment, and it’s reliable in terms of constant concentration, which allows upstream and downstream to run well together.”
No Downstream Star
Downstream processing lacks continuous operations on par with perfusion culture, specifically off-the-shelf, reliable, multicolumn chromatography systems that a less skilled downstream operator can use relatively easily. “Continuous downstream operation is at the early stages of growth and acceptance, like ATF and perfusion were eight years ago,” Bonham-Carter comments. Key to moving forward will be filling operational gaps, improving reliability, and making a strong case for continuous chromatography.
Although yield considerations have historically dominated upstream discussion, Bonham-Carter raises the prospect of continuous processing enabling greater consistency and, therefore, quality.
An oft-quoted criticism is that in continuous processing, batch designations become nebulous, which will adversely affect quality rather than improving it. “It’s important to understand this,” Bonham-Carter counters. “The FDA readily accepts defining a batch by either volume processed (identical to fed-batch), by product mass produced, or even by time. It’s your choice.” As long as conditions are tightly controlled and batch reproducibility is demonstrated, the prospect of generating large quantities of high-quality product over a two-month culture, without start-up and shut-down interruptions, is another reason to consider continuous processing.
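The three batch definitions Bonham-Carter cites are arithmetically interchangeable once the harvest rate and titer are fixed. A minimal sketch, using entirely hypothetical process numbers, shows how each criterion maps onto a number of days of continuous operation:

```python
# Hypothetical illustration of the three batch definitions mentioned above
# (volume processed, product mass, or elapsed time) for a continuous
# culture. All process numbers are invented for the example.

PERFUSION_RATE = 2.0      # vessel volumes exchanged per day (assumed)
VESSEL_VOLUME_L = 500.0   # working volume in liters (assumed)
TITER_G_PER_L = 0.8       # product concentration in harvest (assumed)

def days_to_batch(volume_limit_l=None, mass_limit_g=None, time_limit_d=None):
    """Days of continuous operation until one 'batch' closes, under
    whichever single criterion is supplied."""
    daily_volume = PERFUSION_RATE * VESSEL_VOLUME_L   # liters harvested/day
    daily_mass = daily_volume * TITER_G_PER_L         # grams of product/day
    if volume_limit_l is not None:
        return volume_limit_l / daily_volume
    if mass_limit_g is not None:
        return mass_limit_g / daily_mass
    return time_limit_d

# A 10,000 L volume-defined batch closes in 10 days at this exchange rate:
print(days_to_batch(volume_limit_l=10_000))   # 10.0
# At this titer, an 8 kg mass-defined batch closes in the same 10 days:
print(days_to_batch(mass_limit_g=8_000))      # 10.0
```

Whatever definition is chosen, the point of the sketch is that batch boundaries remain perfectly well defined; they are simply drawn on a running process rather than around a discrete vessel.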
Semi, Quasi, Pseudo?
According to William Whitford, senior manager, cell culture, GE Healthcare Life Sciences, many of the benefits of continuous processing are achievable outside of strictly continuous, seamlessly linked upstream and downstream operations. Numerous possibilities exist along the continuum between batch and continuous. These include semi-continuous and pseudo-continuous unit operations, or batteries of directly connected batch-type operations.
Enthusiasm for continuous processing exists on its own continuum as well. “On one side are people who speak glowingly about it, who will point out that most other significant products are made through continuous manufacturing,” Whitford says. At the other end are those with serious concerns, who fret over the definition of “batch” and how to handle recalls, who generally mistrust the technology. Somewhere in the middle, Whitford says, are companies “steadily working on continuous processing, approaching it systematically and hoping to achieve it within a realistic timeframe.”
Current intensified biomanufacturing approaches include intensified batch, intensified fed-batch, intensified perfusion, and others. If these approaches were represented by Venn diagrams, they would overlap—or so Whitford suggests with a hypothetical question: “What do you have if you hook up a hollow-fiber filter to a stirred-tank bioreactor, grow the cells to high density, and periodically harvest product while exchanging the medium? Is that batch, fed-batch, or perfusion?”
Additional approaches and complicating factors include dialysis, extraction, steady-state operation, internal concentration, internal/external filtration, chemostat, repeated batch, and even attached continuous cultures.
“They all overlap,” Whitford insists. “You could devise a platform employing two of those ideas and have a brand-new category. You can use perfusion to intensify a batch culture, or use it to establish a chemostat. Or you can design a perfusion and continuous process that is not chemostat. To the person who hasn’t thought about it a lot, each is an individual, separate approach. But as you look closely there’s a lot of crossover.”
It is possible even to have “almost continuous” processing using batch reactors. If a culture takes seven days to mature and seven tanks are used, then downstream operations will see a daily, semicontinuous flow of harvest. “It becomes a matter of semantics,” Whitford asserts.
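Whitford’s seven-tank scenario reduces to simple scheduling arithmetic. A short sketch (the staggering scheme is an assumption; the article specifies only seven tanks on a seven-day cycle) shows why downstream sees one harvest per day:

```python
# Sketch of the "almost continuous" scenario above: seven batch reactors
# on a 7-day cycle, started one day apart, deliver one harvest per day
# to downstream. The one-day stagger is an assumed scheduling choice.

CYCLE_DAYS = 7
N_TANKS = 7

def harvests(day):
    """Tanks (0-indexed) that harvest on a given day, assuming tank i
    first harvests on day i and every CYCLE_DAYS thereafter."""
    return [i for i in range(N_TANKS)
            if day >= i and (day - i) % CYCLE_DAYS == 0]

# Once all tanks are running, exactly one harvests each day:
print([harvests(d) for d in range(7, 14)])
# [[0], [1], [2], [3], [4], [5], [6]]
```

The same arithmetic generalizes: with N tanks on an N-day cycle, staggered evenly, the harvest stream becomes as finely grained as the number of tanks allows, which is exactly the “matter of semantics” Whitford describes.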
Automation and computerization are factors as well. For example, inline process analytics rely on autosamplers, real-time analytics, and feedback control. According to Whitford, SCADA (supervisory control and data acquisition) or some other form of high-level enterprise control may be relevant where a computer not only monitors raw material inventories, upstream material preparation, upstream and downstream operations, and packaging, but also controls related activities. “If the process runs out of media, the computer issues an alarm and decides how to handle intermediates,” Whitford explains.
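The supervisory behavior Whitford describes, in which the control layer watches media inventory and decides what to do before the process runs dry, can be sketched in miniature. This is a toy illustration, not any real SCADA product; all class names and thresholds are hypothetical:

```python
# Toy sketch of the supervisory logic described above: a monitor that
# compares media inventory with steady consumption and raises an alarm,
# with a disposition suggestion for intermediates, before media runs out.
# All names, numbers, and thresholds are hypothetical.

from dataclasses import dataclass

@dataclass
class MediaMonitor:
    inventory_l: float           # media currently on hand, liters
    consumption_l_per_h: float   # steady perfusion draw, liters/hour
    alarm_horizon_h: float = 24  # alarm if under a day of media remains

    def hours_remaining(self) -> float:
        return self.inventory_l / self.consumption_l_per_h

    def check(self) -> str:
        if self.hours_remaining() < self.alarm_horizon_h:
            # A real SCADA layer would page an operator here and decide
            # whether to hold or divert in-process intermediates.
            return "ALARM: media low; hold intermediates pending resupply"
        return "OK"

monitor = MediaMonitor(inventory_l=400, consumption_l_per_h=20)
print(monitor.hours_remaining())  # 20.0
print(monitor.check())            # ALARM: media low; ...
```

The design point is the one Whitford makes: the computer is not merely logging values but making a disposition decision, which is what distinguishes enterprise-level supervisory control from simple monitoring.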
GEN readers are by now familiar with ongoing efforts to standardize certain aspects of single-use processing, particularly with regard to leachables and extractables. Any consensus reached within the next few years will reflect current batch processing times, which typically run 7 to 14 days. Since continuous upstream processing might proceed for several months, leachables and extractables standards drawn up in 2015 or 2018 will likely not apply.
On the other hand, media residence time is much shorter in perfusion culture. Media formulated for batch or fed-batch processes may be overkill—too rich in nutrients and too expensive—for processes that continuously replenish media.
“Talking about soup-to-nuts continuous processing is useful, but right now it’s a dream,” Whitford advises. “Today, bioprocessors are limited to considering a perfusion cell culture upstream or, in even more limited cases, simulated moving bed or other multicolumn chromatography downstream.”
Lacking a robust downstream component, realization of fully continuous end-to-end bioprocesses will take a few more years. Such processes will require bringing upstream and downstream operations closer together, and perhaps connecting production and purification more closely than they are today.
In Bonham-Carter’s view, advances in perfusion culture will facilitate and accelerate acceptance of continuous purification, and perhaps inspire developers’ vision of what is achievable within a few short years.