Feature Articles : Jun 1, 2012
Downstream Bottlenecks: More than Just Perception
One can barely discuss upstream bioprocess productivity without mentioning downstream process bottlenecks and their close cousin, the capacity mismatch. As my colleague Gail Dutton wrote in GEN in 2008, “the answer to whether a downstream bioprocessing bottleneck exists is largely one of perception.”
Unlike the Easter Bunny, bottlenecks do exist, as is amply demonstrated by product hold tanks and the various activities and accounting entries associated with waiting. Where perception enters the picture is in assessing whether purification bottlenecks are unusual, unexpected, unmanageable, or adversely affect product quality. On those issues bioprocessors are firmly in control. That does not mean upstream-downstream mismatches are not still a challenge.
Slowdowns typically occur with low-capacity operations or those with high residence times. By analogy to rate-limiting operations in a chemical reaction, downstream bottlenecks represent the “slow step” in the overall process. The reason they occur at all downstream has more to do with the purification train design and process economics than any innate shortcoming of a particular step.
Scheduling issues also factor in. An increase in volume or titer through upstream productivity enhancements may require, for example, round-the-clock scheduling where the norm may be two shifts. While adding a shift is an option, a more likely solution would be to hold product while waiting for the last bit of material to wind through. This in turn adds to concerns for product stability, monitoring and quality testing, the availability of holding tanks, and buffer prep/storage.
“Usually plants have a certain footprint and have to make the process fit within it,” observes Chris Gallo, Ph.D., of Pfizer Bioprocess R&D. “We’re always cognizant of costs of raw materials, buffers, resins, and processing times, but product quality comes first, drives everything else.”
Evolutionary vs. Revolutionary Progress
The “classical” view of purification bottlenecks caused by upstream-downstream mismatches retains validity in many cases, at least operationally.
Most bottlenecks occur as a result of equipment or facility limitations, says Sa V. Ho, Ph.D., senior research fellow at Pfizer Biotherapeutic Pharmaceutical Sciences. For example, “if, over time, through optimization of cells and upstream processes, the titer rises from one or two grams per liter to four or five grams, you will find yourself with two or three times as much product to process from the same volume of broth.”
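The scale of the mismatch Dr. Ho describes is simple linear arithmetic; a minimal sketch (the bioreactor harvest volume is a hypothetical figure, not from the article):

```python
# Product mass scales linearly with titer at fixed harvest volume.
def harvest_mass_kg(titer_g_per_l: float, volume_l: float) -> float:
    """Mass of product in one harvest, in kilograms."""
    return titer_g_per_l * volume_l / 1000.0

VOLUME_L = 10_000  # hypothetical bioreactor harvest volume
low_titer = harvest_mass_kg(1.5, VOLUME_L)   # the "one or two grams per liter" era
high_titer = harvest_mass_kg(4.5, VOLUME_L)  # today's "four or five grams"
print(low_titer, high_titer)  # 15 kg vs. 45 kg: three times the protein, same broth volume
```

A downstream train sized for the 15 kg case must now absorb three times the mass from the same volume of broth, which is exactly where fixed-footprint facilities feel the squeeze.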
Downstream operations that are limited in terms of capacity or physical layout, or designed specifically for low-titer cell cultures, are the hardest hit.
Within the downstream process itself, specific unit operations tend to be limiting, particularly those already operating at full capacity. Chromatography columns, for example, typically have a limited binding capacity range within which they are optimized for product purity and yield. In cell culture processing, protein A capture resins for monoclonal antibodies have improved in terms of binding capacity, but their expense complicates their efficient utilization.
“You would think that debottlenecking a capture chromatography step would be a simple matter of buying more resin and filling a larger column. But many facilities cannot accommodate much larger columns or the required buffer preparation and storage. Some facilities simply cannot withstand the total weight of a two-meter column, for example, even if resin cost were not a factor,” Dr. Ho explains.
Similarly, managing high downstream expectations in light of upstream productivity is “the current issue” for Thierry Ziegler, Ph.D., who heads bioprocessing and core platform efforts at the Sanofi Vitry, France facility.
Dr. Ziegler holds that “evolutionary” downstream improvements have not kept up with protein titers from cell cultures. “If you look back ten years, tools for mAb purification consisted mainly of protein A, ion exchange, and hydrophobic interaction chromatography. Today the new resins have improved capacity and functionality, but there has been no quantum leap in how much protein they can handle. Downstream processing simply has not progressed to the same degree as upstream.”
Resin vendors argue, with justification, that performance improvements have been substantial. Protein A resins bind 50–100% more protein per volume of resin than a decade ago, and improved resistance to cleaning/sanitizing agents allows running more cycles than ever. On the ion-exchange front, new resins, ligands, and matrices have improved capacity, flow rates, and selectivity.
Mixed-mode ligands have been around for many years (the original mixed-mode medium, hydroxyapatite, was discovered in the early 1900s) but have only recently been recognized for their exceptional selectivity and binding capacity, and their potential for reducing process steps.
Mixed-mode columns have enabled Sanofi to compress mAb purification from the traditional three-column process to a two-column scheme consisting of mixed-mode and ion-exchange columns while providing equivalent yield and product purity.
Ion exchange works well as capture and intermediate chromatography for many recombinant, nonantibody proteins because of the wide variation in the proteins’ isoelectric points (pIs). Ion exchange is less effective for monoclonals, but mixed-mode columns are increasingly viewed as alternatives for both capture and impurity removal.
No one is intimating that protein A is on its way out. “Protein A cost is counterbalanced by the very high purity of the mAbs it helps generate,” says Dr. Ziegler. “What some critics fail to realize is that highly efficient and robust capture means producers need to invest less in operations succeeding protein A capture. If they began with cation exchange, they’d need to invest more in steps downstream of capture.”
Upstream productivity has had far less impact on removing residual impurities than on capture and intermediate chromatography. The throughput of “polishing” columns, which are run in flow-through mode, depends more on impurity levels, which remain relatively low even in high-titer processes, than on protein concentrations.
Many companies, Sanofi included, have replaced anion-exchange polishing with membrane chromatography to reduce processing time and lower costs related to buffers and resins. Not every purification is achievable in flow-through mode, and membrane adsorbers are not competitive when used in bind/elute mode.
Ultimately what are referred to as “bottlenecks” come down to complex problems of economics rather than straightforward capacity logjams. In a pinch, processors can always enlarge specific unit operations, redesign downstream facilities, or increase daily work shifts from two to three to eliminate bottlenecks. These considerations are always balanced against facility utilization, cost, and product yield and purity.
Or one can simply build the anticipated downstream capacity into a new facility, as Innovent Biologics is doing. Innovent, which develops biosimilars for the Chinese market, is setting up shop in Suzhou, China, where it is headquartered. COO Scott M. Wheelwright, Ph.D., says that despite increasing yields from the upstream side, “we can certainly design a plant to handle the purification” without great concern for bottlenecks. “For us the bottleneck will be the bioreactor.”
Starting from a clean slate allows facility designers and process engineers to consider process intensification and streamlining such necessities as loading, washing, elution, buffer preparation, storage, hold, and equipment utilization. Cost and scheduling exercises become infinitely easier with a blank sheet of paper in front of you. Plant utilization, Dr. Ho says, is a more serious economic issue than most people realize; scheduling and staggering unit operations to assure that bottlenecks operate at full capacity is one work-around.
It’s the Economics
Wheelwright subscribes to the “economics theory” of bottlenecks, which particularly affects legacy processes with fixed resources. “You can put in a large enough column to run a batch in a single cycle, or you can use a smaller column and multiple cycles.” Downstream processing, under this model, becomes a trade-off between the materials and facility requirements of larger equipment and the time required for multiple cycles.
“The issues are most critical for CMOs operating older plants,” he says. “Due to upstream productivity they can now bid on a job that requires a smaller bioreactor, but they have problems processing all that protein.”
Most new facilities, he notes, use process simulation to understand those trade-offs before the plant is constructed. Regardless, there is no easy answer to the apparent lack of economy for clinical batches. Processes optimized for commercial manufacturing attempt to squeeze every ounce of value from membranes and resins. At clinical scale, where time is critical, most firms never realize the full potential of expensive media, whereas during production the cost of high-priced materials and equipment are amortized over many batches.
Downstream bottlenecks arise, at bottom, from the cost of purification media, notes Sarfaraz K. Niazi, Ph.D., chairman of Therapeutic Proteins International. “Protein A costs $17,000 per kilo, and a large process may use hundreds of kilograms.”
His company, which manufactures monoclonal antibodies and cytokines in bacteria and animal cells, has simplified protein purification by conducting expression and purification in one container. Therapeutic Proteins was awarded a patent in early 2012 for the fully contained, 100% disposable process vessel, and will soon introduce a 5,000 L model.
The trick is harvesting protein instead of cells or supernatant. Protein A added to the fermenter adsorbs product while the cells and broth are skimmed off. Product is subsequently eluted off the resin and subjected to further purification steps. This invention is part of the U.S. Department of Defense’s “medicine on demand” initiative. “We can save about 50 hours of process time per batch,” Dr. Niazi says, “while improving yield and lowering costs.”
According to Anthony Newcombe, Ph.D., who heads process and product validation for GlaxoSmithKline Biologics, quality by design (QbD) applied early in development can help bioprocessors avoid bottlenecks. QbD traditionally aims at improving process knowledge. When adopted early enough, it can help to broaden the design space so that problems like higher than expected titers may be addressed without panic.
“QbD allows you to assess process risk early, to evaluate key quality attributes and critical process parameters earlier in process design, and to help understand process robustness,” he says. Understanding these factors provides “wiggle room” for facing the unexpected confidently, for example by building in extra capacity. It also works from the opposite direction by providing understanding of less-than-robust operations.
QbD has been around for several years, but smaller companies remain skeptical about its potential rewards, says Dr. Newcombe. “It’s not fully embraced industry-wide, and it will take a few more years before regulators and industry align on QbD’s requirements.”
Bioprocess bottlenecks, while hardly product-killers, are a genuine concern for established processes. Biotech companies and their suppliers have done a tremendous job of meeting mismatch challenges and will continue doing so for as long as upstream productivity continues climbing.
“A fair amount of work still needs to be done on the downstream side,” says Dr. Wheelwright, “but I’m confident that companies and academic groups involved in research and technology development will come up with good solutions to resolve bottlenecks and reduce the cost of downstream processing. I’m looking forward to the next five years.”
© 2016 Genetic Engineering & Biotechnology News, All Rights Reserved