April 15, 2008 (Vol. 28, No. 8)

Gail Dutton

Throughput Isn’t Where It Should Be, and There Is Some Disagreement as to Why
Like the story of the blind men describing an elephant, the answer to whether a downstream bioprocessing bottleneck exists is largely one of perception.

“The situation is even worse than it was two years ago,” according to Uwe Gottschalk, Ph.D., vp of Purification Technology, a business unit of Sartorius Stedim Biotech. Ann O’Hara, GM of life science services at GE Healthcare Life Sciences, counters that the presence or lack of a bottleneck “is a matter of optimization. The solutions are here.”

“The issue is partly a matter of optimization and partly the nature of operations,” elaborates Daniel Van Plew, vp and GM of industrial operations and product supply at Regeneron Pharmaceuticals.

Manufacturers generally seem to favor the optimization argument. Whatever it’s called, though, everybody agrees that throughput isn’t what it should be. Managers and scientists within the biotech community are finally aware of the challenge, and vendors are beginning to work to solve it.

That earlier lack of awareness stems largely from the way scientists are trained. “You’re either an upstream expert or a downstream expert,” Dr. Gottschalk explains, “and managers don’t listen to both equally.” It’s similar to the long-touted disconnect between information technology and management, in which business units don’t speak the language of IT, and IT doesn’t speak the language of business. The outcome, in biotech, is a gap in which upstream cell culture advances have outpaced downstream separation and purification advances.

Upstream operations have been an area of focus for a long time because the industry feared a capacity shortage. As you may recall, about 10 years ago, manufacturers were bemoaning the lack of cell culture capacity, and conference presentations deftly outlined a capacity crunch that would leave mAb producers, in particular, woefully short, even if massive investments were made in new facilities. The industry rallied to the challenge. Batches were optimized to become more productive for longer periods. Titers increased.

“At the turn of the century, fermentation was very dilute, and water was the number one contaminant,” recounts O’Hara. The need to remove that water drove the adoption of affinity chromatography. Upstream improvements then cut the water burden to perhaps one-tenth of what it had been. “Whenever you have improvements like that, the next step shows a burden, so even though the next step may become twice as effective, it doesn’t match.”

With those advances, Van Plew says, “sixty to seventy percent of the titers today are in the one to three grams per liter range.” At titers of one gram per liter, “we’re seeing large cell culture capacities of 15,000 to 25,000 liter bioreactors in sets of four or six,” says Duncan Lowe, Ph.D., scientific executive director, process development, at Amgen. But, he adds, “I’ve seen titers up to 10 grams per liter at small scale,” and titers are expected to reach 20 grams per liter soon.

These titers solved the immediate upstream need but diverted attention from downstream issues, leading to an imbalance between upstream and downstream capacities. Consequently, Dr. Gottschalk says, “we have five times more product coming from fermentation, which requires five times the columns and buffer volumes. Binding capacities, however, haven’t increased that much.”
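The arithmetic behind Dr. Gottschalk’s point can be sketched in a few lines. This is a back-of-the-envelope illustration only; the bioreactor volume, column volume, and binding capacity below are hypothetical figures chosen to be in the ranges quoted in this article, not data from any named plant.

```python
# Back-of-the-envelope: how a titer increase scales the capture step.
# All plant figures are hypothetical, chosen to match the ranges
# quoted in the article (15,000 L reactors, ~40 g/L binding capacity).

def capture_cycles(titer_g_per_l, harvest_l, column_l, binding_g_per_l):
    """Chromatography cycles needed to capture one harvest."""
    product_g = titer_g_per_l * harvest_l
    capacity_per_cycle_g = column_l * binding_g_per_l
    # Round up: a partial cycle still has to be run.
    return -(-product_g // capacity_per_cycle_g)

# Hypothetical plant: 15,000 L harvest, 500 L Protein A column,
# 40 g/L dynamic binding capacity.
low = capture_cycles(1.0, 15_000, 500, 40)   # 1 g/L titer -> 1 cycle
high = capture_cycles(5.0, 15_000, 500, 40)  # 5 g/L titer -> 4 cycles
print(low, high)
```

Five times the product does not always mean exactly five times the cycles, because cycles round up to whole numbers, but the proportional growth in column time, buffer, and tankage is the bottleneck Dr. Gottschalk describes.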

Van Plew notes that “columns are thought of mechanically, something you can bolt on to expand production.” Until recently, vendors focused on making larger chromatography columns, which use more resin, to handle the increased upstream yields. “These columns still often need to be cycled many times, which can also result in installing bigger buffer and product pool tanks,” explains Scott Carver, Ph.D., director of manufacturing sciences at Regeneron.

The larger columns are proving unwieldy, though, so they have been less widely adopted than their developers had hoped. “Theoretically, large columns work,” Dr. Gottschalk says, “but technical issues and costs make them impractical.”

Addressing the Issues

Now the industry is beginning to look at higher affinity columns and faster resins to yield more robust, scalable systems that increase throughput. GE Healthcare and other vendors are beginning to concentrate on increasing downstream throughput.

“Millipore, for example, recently came out with a Protein A resin with an antibody binding capacity up to 50 milligrams per milliliter,” Dr. Carver reports. “That’s a worthwhile increase from the more typical highs of 40 milligrams per milliliter, and it’s encouraging that vendors are focusing on increasing resin-binding capacities.”
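What a capacity bump of that size buys can be estimated directly from the figures Dr. Carver quotes. The batch size and cycle count below are hypothetical, picked only to make the proportions concrete.

```python
# Rough illustration of the binding-capacity increase quoted above
# (40 -> 50 mg antibody per mL of resin). Batch size and cycle count
# are hypothetical.

def resin_needed_l(product_g, binding_mg_per_ml, cycles):
    """Resin volume (L) to capture product_g grams over a fixed number
    of cycles. mg/mL is numerically equal to g/L, so no conversion."""
    return product_g / (binding_mg_per_ml * cycles)

# Hypothetical 45 kg batch processed over 3 capture cycles.
old = resin_needed_l(45_000, 40, 3)  # 375.0 L of resin
new = resin_needed_l(45_000, 50, 3)  # 300.0 L of resin
print(f"resin saved: {1 - new / old:.0%}")  # 20%
```

A 40-to-50 mg/mL improvement trims resin needs, or equivalently cycle counts, by about a fifth, which is meaningful but still far short of matching a fivefold titer increase upstream.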

Another approach is to use membrane chromatography in a flow-through mode to capture the contaminants rather than the products. “That has the beauty of being disposable,” adds Dr. Gottschalk. It can also handle large volumes with increased flow-through rates.

“Our strategy should be to focus more on contaminants than products,” says Dr. Lowe. “Remove contaminants as early as possible, before antibody capture.” Early contaminant removal shouldn’t create bottlenecks in subsequent activities, he reports. “It’s the first capture column with antibodies that’s a concern. Papers have shown that you can scale them up to a point,” but doing so becomes expensive.

So, it may be correct to assert that much of the technology is available. Availability and implementation, however, are two different things. This leaves room for debate regarding whether there really is a bottleneck in downstream processing. “We feel there is a bottleneck,” Van Plew says. “Downstream purification technologies aren’t well integrated with upstream efforts, which results in spot solutions.”

Dr. Lowe takes the counterpoint. “Although there are constraints of capacity, it’s a condition of what you’re willing to pay to remove those constraints.”

Optimizing current technologies can resolve downstream bottleneck issues, but it can be very expensive, points out John Cox, senior vp of global manufacturing at Biogen Idec. “When you’re making clinical products, you’re making just a few batches, so using the most advanced and expensive chromatography resins can be costly on a per-batch basis.”

For large-scale, commercial processes, infrastructure changes such as larger vessels and columns may be required and are also costly. So, although the optimization of technology and engineering can eliminate downstream constraints, improved downstream economics is still needed, Cox notes.

Whether a bottleneck exists often depends upon when a manufacturing plant was constructed. “Capacity varies depending on when it was built,” O’Hara explains. Facilities are planned to provide a certain capacity based upon current demand and future expectations. When upstream improvements are made, downstream capacity is no longer scaled appropriately to handle the increased output, so companies either must retrofit existing facilities or plan new facilities accordingly.

Biogen Idec, for example, is commissioning its second large-scale facility in Denmark, designed with high downstream capacity and throughput in mind, Cox explains. The company has also invested in upgrading its existing facilities, “so new and existing facilities have identical or similar capabilities and technologies,” Cox says. “Higher throughput HPLC resins are an important part of that, especially for mAbs,” as well as harvest and filtration technologies.

Standard, fixed manufacturing remains a good investment for blockbuster drugs, as well as some vaccines and perhaps oncology products, but is less suited to products with lower demand. When planning new facilities, management must consider the rapid changes the industry is encountering.

“Generics are coming, dosages are being reduced, companies are sharing markets,” O’Hara says. Also, drugs are becoming more specialized as genetic variations that affect efficacy are taken into account. Companies, therefore, need to move nimbly to respond to those changing parameters. Flexible manufacturing allows that and assuages the gap between upstream and downstream needs.

Modular manufacturing facilities let companies easily change and optimize operations and floor space as needs evolve and technology improves. Flexible manufacturing technologies are good for scale-up, too, Dr. Gottschalk says. In addition to flexibility, “Phase I and II operations can use disposable manufacturing to avoid tying up capital in stainless steel.”

The ability to retrofit modular manufacturing into existing plants also depends upon the facility. “If it’s wall-to-wall, it can be difficult,” O’Hara says. She recommends first taking a Lean approach to manufacturing and an industrial engineering approach to optimize not only the manufacturing processes but the movement from one process to the next. Managing timeframes can help alleviate many bottlenecks, she suggests.

For example, Amgen’s Dr. Lowe points out that today’s batch processing results in chromatography columns sitting idle at times. Staggering batches, however, could increase throughput and relieve bottlenecks by approximating continuous processing. That’s not a panacea, but it’s a good first step to minimizing bottlenecks.
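The staggering idea reduces to a simple utilization calculation: one capture column shared by several upstream trains is busy more of the time. The sketch below is a toy model with hypothetical numbers, not a description of any Amgen process.

```python
# Toy illustration of batch staggering: one capture column shared by
# several staggered bioreactor trains. All numbers are hypothetical.

def column_utilization(trains, capture_hours, harvest_interval_hours):
    """Fraction of time the capture column is busy when `trains`
    staggered bioreactors each deliver one harvest every
    `harvest_interval_hours`, and each harvest occupies the column
    for `capture_hours`."""
    busy = trains * capture_hours
    return min(busy / harvest_interval_hours, 1.0)

# One train harvesting every 96 h keeps the column busy only 12.5%
# of the time; staggering four trains onto the same column raises
# that to 50%.
print(column_utilization(1, 12, 96))  # 0.125
print(column_utilization(4, 12, 96))  # 0.5
```

The same equipment does more work simply because its idle windows are filled, which is why staggering approximates continuous processing without new capital spending.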

“Flexible manufacturing is important from a forecasting and facility utilization standpoint,” Cox adds. “As companies move toward higher titers and higher purification throughput, flexible manufacturing facilities become strategic tools. A flexible facility can process more production campaigns, handle a wider range of process yields, and, ultimately, dramatically increase a facility’s output.”

These solutions are expected to stand for the next decade, Dr. Gottschalk says. After that, he predicts chromatography will be replaced as a purification step by advances in fractionation, precipitation, and crystallization.
