Feature Articles: May 1, 2010
Leveling Downstream Process Bottlenecks
Alternatives to Traditional Unit Operations Are Now Available to Alleviate Very Real Logjams
Due to the complexity of therapeutic protein purification, downstream bioprocess bottlenecks can arise almost anywhere, from capacity and facility mismatches to scheduling. Although their sources and bottom-line impact may be debated, the bottlenecks are real, and alleviating them has become a significant activity, to the point where companies are seriously considering alternatives to traditional unit operations.
As product titers rise, negative chromatography is becoming a viable alternative to a dedicated capture step. Negative chromatography captures impurities while allowing products to flow through. BioToolomics is working with several biopharmaceutical clients on negative chromatography for mAbs based on high-capacity, low-cost, single-use media in disposable columns.
BioToolomics first employs high-throughput methods to characterize impurities and screen its library of mixed-mode ligand resins. The goal, says Chad Zhang, Ph.D., managing director, is a one- or two-column process that clears all impurities to desired levels.
Protein A capture no longer makes sense, Dr. Zhang argues, because of its high cost and failure to concentrate product. “When titers were one gram per liter and the resin had a capacity of 15 grams per liter it was possible, through capture and elution alone, to reduce feedstock volumes 10-fold. But with today’s titers in the 15 to 20 gram per liter range and resins holding 40 grams or so of protein per liter, you need 3 or 4 column volumes to elute the product. Instead of concentrating you wind up with the same or higher volumes than before capture.”
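Dr. Zhang's arithmetic can be sanity-checked in a few lines of Python. This is an illustrative sketch: the 1.5- and 3.5-column-volume elution figures are assumptions chosen to be consistent with his quote, not measured values.

```python
def capture_volume_reduction(titer_g_per_l, resin_capacity_g_per_l, elution_cvs):
    """Estimate the volume reduction factor of a bind-and-elute capture step.

    Feed volume needed to saturate 1 L of resin = capacity / titer.
    Eluate volume per 1 L of resin = elution column volumes (CVs).
    The ratio of the two is the concentration factor.
    """
    feed_volume = resin_capacity_g_per_l / titer_g_per_l  # L feed per L resin
    return feed_volume / elution_cvs

# Historical case: 1 g/L titer, 15 g/L capacity, ~1.5 CV elution (assumed)
print(capture_volume_reduction(1, 15, 1.5))   # 10-fold volume reduction
# Modern case: 15 g/L titer, 40 g/L capacity, ~3.5 CV elution (assumed)
print(capture_volume_reduction(15, 40, 3.5))  # below 1: eluate exceeds feed
```

A result below 1 means the eluate pool is larger than the feed, which is exactly the "same or higher volumes than before capture" problem Dr. Zhang describes.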
Host cell proteins (HCPs) are a challenge for negative chromatography, but no more so than with traditional capture chromatography. According to Dr. Zhang, HCPs linger as four to five percent of total protein after traditional capture, which is one reason why bioprocessors employ more than one column. The same is true for negative chromatography. Dr. Zhang is confident that negative chromatography can eventually remove HCPs, as well as viruses and other contaminants, to the 10 ppm range.
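To see why the 10 ppm target is ambitious, it helps to convert the four-to-five-percent HCP figure into the same units. A rough illustration, taking the 4% split from the quote above:

```python
def hcp_ppm(hcp_mg, product_mg):
    """Express host cell protein content in parts per million of product
    (1 ppm is equivalent to 1 ng HCP per mg product)."""
    return hcp_mg / product_mg * 1e6

# If HCP is 4% of total protein, a 100 mg sample holds 4 mg HCP
# alongside 96 mg product:
print(round(hcp_ppm(4, 96)))  # roughly 41,700 ppm
# Reaching the 10 ppm range therefore requires a further ~4,000-fold
# HCP clearance downstream of the capture step.
```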
BioToolomics hopes that negative chromatography will eventually join the technology toolbox that biomanufacturers turn to for purifying therapeutic proteins. The technique is by no means one-size-fits-all, however. “Deploying it successfully requires close examination of a customer’s antibody product, cell culture process, and impurities,” says Dr. Zhang.
Jens Vogel, Ph.D., global CMC development team leader at Bayer, believes the downstream capacity crunch is somewhat overblown. “Downstream bottlenecks do occur in existing facilities producing very high-dose monoclonal antibodies for multiple indications. But based on our experience, for most products and indications, existing unit operations are sufficient if you optimize and scale them properly and work on process intensification and scheduling.”
The real bottleneck for companies like Bayer, Dr. Vogel says, is bringing larger numbers of molecules through the pipeline as quickly as possible. Here, he says, downstream bottlenecks can occur for non-mAb therapeutic proteins for which platform purification processes do not exist.
Bayer is seriously considering negative chromatography to simplify downstream operations, improve quality, and reduce costs. “Negative chromatography becomes more attractive when your upstream platform produces very high titers,” Dr. Vogel says. With high masses of target proteins, he says, it sometimes makes more sense to bind the relatively smaller amounts of impurities. Like most biotech firms, Bayer already uses disposable membrane adsorbers for polishing, a practice Dr. Vogel says will expand as newer generations of membranes emerge.
Bayer’s interest in alternative separations arises in part from its diverse product pipeline that includes complex, highly potent, nonantibody molecules. But if adopted, techniques like negative chromatography must be applied early in process development—not just to mitigate potential regulatory hurdles but to avoid delays in getting drugs into the clinic.
Bayer is investigating negative chromatography with third-party partners and considers the effort evolutionary. “If it comes too late for a particular molecule, at least we have taken it to proof-of-principle stage and can plug it in when the next project comes along.” Dr. Vogel believes that Bayer could eventually develop a purification process based on one packed bed affinity column and two disposable membrane steps—one for capture and one for polishing. Such a system would work particularly well with highly potent, labile proteins but not with large-volume products like antibodies.
Jonathan Romero, Ph.D., senior engineer III at Biogen Idec, believes that some of the upstream-downstream capacity mismatch is self-imposed. “It’s when downstream says, ‘OK, let’s process it all’ that bottlenecks occur, particularly in older facilities designed for lower-titer processes.” And, absent market demand, overproducing introduces logistical issues related to cold chaining, storage, and, in the worst case, product expiration.
Dr. Romero suggests processing the batch through the clarification step and freezing half for later purification. Clarification is a good endpoint because, for proteins, the major purification cost is incurred at the capture step. Another option is to precipitate the product and store it as a salt.
Bottlenecks don’t usually arise around unit operations but from auxiliary operations like buffer mixing and storage. “Before you know it tanks become larger than the production vessels.” Expanding these areas, Dr. Romero says, involves huge capital outlays. “These bottlenecks are far more troublesome than capacity issues for resins or viral filters. People deal with those. You can still process when they occur.”
Dr. Romero’s group is actively pursuing what he terms “disruptive” technologies, namely precipitation, expanded bed chromatography, and simulated moving bed chromatography, while also squeezing as much productivity as possible from existing equipment and facilities.
None have yet reached production levels at Biogen Idec. Dr. Romero cites time, space, up-front capital costs, production-scale validation, and suitability to platforming as the major challenges. “We’re always running the facility, so finding the right time to implement some of these technologies is difficult.”
An even greater concern is the degree to which, say, simulated moving bed chromatography will work with all or most of the company’s proteins, which include antibodies, fusion proteins, interferons, and others. “We’re reluctant to bring in technologies that may only work on one-third of our molecules. So at this point we’re looking for more of a stepwise improvement in the new operations, while squeezing as much productivity as we can from existing technologies.”
Location, Location, Location
Facilities play a substantial role in streamlining or bottlenecking bioprocess workflows. “Designers of new facilities have the luxury of designing everything from the ground up, with upstream and downstream processes in mind,” says Dave Wareheim of Integrated Project Services.
“Greenfield” facilities may even incorporate room to expand or bring in new equipment. But when retrofitting an existing building is the starting point, “cramming a process into an existing shell requires slightly different judgment regarding interstitial spaces, whether the process will stand on the ground floor or higher floor, how to deal with drainage. Here facility and services can get in the way,” Wareheim explains.
Rising protein titers in mammalian cell culture bring higher upstream productivity and, potentially, lower cost of goods. The downsides are underutilization of upstream space and overtaxing of downstream areas designed for less productive cultures. In some situations, Wareheim reports, biomanufacturers have simply abandoned such facilities, not because of bottlenecks but because the plants have become too big for the process. “Utilization of 70 to 80 percent is fine, but when it gets down to 40 or 50 percent people start to worry.”
Where upstream and downstream processing mismatches result in bottlenecks, adopting new separation technologies can help, provided these methods are adopted early enough in process development. An obvious option is high-capacity resins, but these are expensive and, many believe, approaching their theoretical capacity. Similarly, companies might consider ultra high-throughput virus filters, but processors have long complained about the cost and time involved—up to 15 hours per batch.
Wareheim mentions several separation methods that might help. One is fluidized bed chromatography, which combines cell removal with affinity-based purification. Replacing two operations (filtration or centrifugation plus protein A chromatography) with one both shrinks the footprint and debottlenecks the centrifugation/filtration step.
Wareheim also likes simulated moving bed techniques employing multiple (but smaller) chromatography columns. “This technique takes longer but provides a two- to threefold reduction in resin costs.” Finally, he suggests in-line buffer dilution to minimize the number of storage tanks. “One or two companies are using this and doubling their productivity during buffer dilution with minimal added footprint.”
Parrish M. Galliher, founder and CTO at Xcellerex, believes that downstream processing will be transformed over the next few years to a degree similar to the introduction of single-use products.
“It has to happen because of the pressures placed on downstream operations by improvements upstream.” He acknowledges, however, that for the downstream revolution he envisions to occur, bioprocessors have to turn their thinking and current practices on their heads. “Our goal is 10-, 20-, or 30-fold improvements in throughput and productivity for protein purification.”
The way to achieve this is not by improving resin capacities or enlarging chromatography columns, but by vastly increasing the workload on resins. “Chromatography resins are only working about 10 or 20 percent of the time. Our idea is to get less resin busy all the time, purifying all the time.”
This, he believes, will shrink resin beds to a manageable size, to the point where discarding them after a single use makes sense. He mentions simulated moving bed and expanded bed chromatography as two systems that approach this ideal. “People shied away from these techniques because cleaning validation was problematic. But if you make them single-use, with resins cheap enough to justify throwing them away, you could sidestep validation.”
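Galliher's utilization argument reduces to simple arithmetic, sketched below. The numbers (a 20 kg batch, 40 g/L capacity, 2-hour cycles) are hypothetical round figures for illustration, not Xcellerex data.

```python
def required_resin_volume(mass_to_purify_g, capacity_g_per_l, cycle_time_h, window_h):
    """Resin volume (L) needed to purify a batch within a processing window,
    given the per-cycle binding capacity and the time per cycle.

    The more cycles each litre of resin runs in the window, the less
    resin is needed.
    """
    cycles = window_h / cycle_time_h  # cycles each litre of resin completes
    return mass_to_purify_g / (capacity_g_per_l * cycles)

batch_g = 20_000  # hypothetical 20 kg batch
# One cycle per window: the traditional single large column
print(required_resin_volume(batch_g, 40, 2, 2))   # 500 L bed
# Resin cycled continuously over a 20 h window (SMB-style operation)
print(required_resin_volume(batch_g, 40, 2, 20))  # 50 L bed
```

A tenfold-smaller bed is what makes the discard-after-one-batch economics Galliher describes start to look plausible.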
Another area where disposables could create a breakthrough is ultrafiltration, a technique used in several places downstream. The current problem, Galliher says, is that scaling ultrafiltration in single-use mode is difficult. “As soon as you go above a one-inch diameter recirculation system the pressures become too great for single-use tubing.” Xcellerex and a collaborator are working on an improvement that Galliher expects will remove this barrier to single-use ultrafiltration.
One general debottlenecking approach is based on process modeling, a specialty of Philip Lyman, Ph.D., director of process simulation at CRB Consulting Engineers. Clients usually have a bottleneck in mind before they turn to CRB.
“They want to make sure that the bottleneck is at the desired location, and that every other operation they don’t want to be the bottleneck has plenty of capacity.” Modeling helps to locate the bottleneck, eliminate it, and identify the next bottleneck. The process continues, iteratively and incrementally, to debottleneck the entire facility. Lessons learned during this exercise are applied to the process moving forward.
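The iterative procedure Dr. Lyman describes can be caricatured in a few lines. The step names, capacities, and 25%-per-upgrade rule below are hypothetical; real models track schedules, utilities, and labor, not a single capacity number per step.

```python
def debottleneck(capacities, target, uprate=1.25):
    """Repeatedly find the lowest-capacity step (the bottleneck), expand it,
    and repeat until every step meets the target throughput.
    Returns the ordered list of upgrades made."""
    caps = dict(capacities)
    log = []
    while min(caps.values()) < target:
        step = min(caps, key=caps.get)                 # current bottleneck
        caps[step] = min(caps[step] * uprate, target)  # expand (capped at target)
        log.append((step, round(caps[step], 1)))
    return log

# Hypothetical step capacities in batches/week, against a target of 5
plan = debottleneck(
    {"capture": 4.0, "polish": 6.0, "UF/DF": 3.0, "buffer prep": 4.5},
    target=5,
)
print(plan)  # first upgrade hits UF/DF, the initial bottleneck
```

Note how fixing one step simply exposes the next-slowest one, which is why the exercise is iterative by nature.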
Almost anything can become a bottleneck. One example is a recipe forced into existing equipment that was not designed or specified for it, causing overutilization in one or more areas. Another arises in support equipment such as buffer prep or hold tanks, clean-in-place skids, or shared utilities. One self-imposed bottleneck, work-shift scheduling, may be overcome by extending work hours from eight to 12 or 24 per day, or into weekends.
“Many different components have to come together, in the end, to generate your desired throughput. Equipment cleaning and turnaround of equipment, utilities for the main process and support equipment, and overarching everything is people or labor. It’s quite difficult to identify a bottleneck after a cursory view of the process. You have to get into it and study it.”
CRB uses several tools for downstream debottlenecking, including SuperPro Designer and SchedulePro from Intelligen, and discrete event simulation packages like FlexSim (FlexSim Software). All of these packages are also popular outside the pharmaceutical and bioprocessing industries. “And sometimes, just a simple Excel spreadsheet model will give us the answers we need,” says Dr. Lyman. “It depends on the problem we’re solving.”
Jamie Hintlian, vp of pharmaceuticals at AspenTech, makes the case for applying more sophisticated software-based tools for scheduling and managing the thousands of combinations of time, materials, and processes arising during biopharmaceutical production.
“Producers of small molecule drugs have never had to deal with the complexity in scheduling and planning that biomanufacturers do.” Process industries dealing with only a handful of resources to juggle can often handle scheduling by hand, or with a spreadsheet, and arrive at some ideal combination of operations. “But if you are dealing with hundreds of resources, no single person can oversee them all.”
One AspenTech client, a contract manufacturer, had to deal with more than 3,000 “activities” for one process. When complexity reaches this level, particularly in a regulated environment, it may be inappropriate to rely on home-brewed solutions, Hintlian cautions. “This is where systems help, and if you have the right system you can squeeze not just a few percentages more out of your operations, but dozens of additional capacities that you could never uncover by virtue of their sheer numbers.”
In-Line Buffer Dilution
Upstream-downstream capacity mismatches, a major source of downstream bottlenecks, arise because downstream operations cannot match upstream’s volumetric productivity. Downstream capacity can be increased by adding equipment, but each new column takes up space. Since buffer concentrations differ from step to step, added operations often require dedicated buffer prep and/or storage tanks.
In-line buffer dilution—the mixing of buffers “on the fly” from concentrates—significantly reduces buffer storage and prep, reserving precious floor space for ramping up capacity. Benefits of in-line dilution include maximizing existing tankage while minimizing cleaning and associated validation of larger tanks.
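The flow splitting behind in-line dilution is easy to sketch. The 10x concentrate and flow rate below are hypothetical; production systems additionally trim the ratio with feedback from in-line sensors.

```python
def inline_dilution_flows(total_flow_l_h, concentrate_factor):
    """Split a target buffer flow into concentrate and water (WFI) streams
    when diluting an N-fold buffer concentrate in-line."""
    concentrate = total_flow_l_h / concentrate_factor
    water = total_flow_l_h - concentrate
    return concentrate, water

# Hypothetical: deliver 1,200 L/hr of working buffer from a 10x concentrate
conc, wfi = inline_dilution_flows(1200, 10)
print(conc, wfi)  # 120.0 1080.0
# Buffer prep and hold now handle 120 L/hr of concentrate instead of
# 1,200 L/hr of working buffer, a 10-fold cut in tankage.
```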
Asahi Kasei Bioprocess (AKB) has introduced an in-line dilution technique, IBD™, that reportedly offers several advantages over conventional in-line dilution, among them more accurate dilution and fewer out-of-spec buffers.
AKB uses in-line, temperature-controlled conductivity and pH measurements, rather than more conventional volumetric or flow-based methods, to determine buffer concentrations. The company claims pH control to within 0.1 units.
“Since we control dilution based on critical process parameters rather than through mass or volume estimates, the accuracy and precision of the dilute buffer output are uncoupled from the accuracy and precision of the formulated concentrate buffers,” says Kimo Sanderson, vp at Asahi Kasei. In-line measurements are critical components of both quality by design and process analytical technology.
Standard IBD systems turn over 1,200 L/hr of buffer; custom-built units can process from 60 to 6,000 L/hr.
© 2012 Genetic Engineering & Biotechnology News, All Rights Reserved