Jun 1, 2012 (Vol. 32, No. 11)

Downstream Bottlenecks More than Just Perception

    Ultimately, what are referred to as “bottlenecks” come down to complex problems of economics rather than straightforward capacity logjams. In a pinch, processors can always enlarge specific unit operations, redesign downstream facilities, or increase daily work shifts from two to three to eliminate bottlenecks. These considerations are always balanced against facility utilization, cost, and product yield and purity.

    Or one can simply build the anticipated downstream capacity into a new facility, as Innovent Biologics is doing. Innovent, which develops biosimilars for the Chinese market, is setting up shop in Suzhou, China, where it is headquartered. COO Scott M. Wheelwright, Ph.D., says that despite increasing yields from the upstream side, “we can certainly design a plant to handle the purification” without great concern for bottlenecks. “For us the bottleneck will be the bioreactor.”

    Starting from a clean slate allows facility designers and process engineers to consider process intensification and streamlining such necessities as loading, washing, elution, buffer preparation, storage, hold, and equipment utilization. Cost and scheduling exercises become infinitely easier with a blank sheet of paper in front of you. Plant utilization, Dr. Ho says, is a more serious economic issue than most people realize; scheduling and staggering unit operations to ensure that bottleneck steps operate at full capacity is one work-around.
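The payoff of staggering can be illustrated with a toy scheduling model. The stage names and durations below are illustrative assumptions, not figures from the article; the point is only that once batches are pipelined, the interval between batch completions is set by the slowest unit operation rather than by the sum of all steps.

```python
# Toy model of staggered (pipelined) batch scheduling.
# Stage durations (hours) are hypothetical placeholders.
stages = {"capture": 8, "polish": 5, "uf_df": 4, "fill": 3}

def sequential_interval(stage_hours):
    """Hours between batch completions when each batch runs start to finish alone."""
    return sum(stage_hours.values())

def staggered_interval(stage_hours):
    """Hours between completions when batches are staggered so the slowest
    (bottleneck) stage never sits idle: that stage alone sets the pace."""
    return max(stage_hours.values())

print(sequential_interval(stages))  # 20 hours per batch, run back to back
print(staggered_interval(stages))   # 8 hours per batch -- capture is the pacer
```

Under this sketch, staggering more than doubles throughput without enlarging any single unit operation, which is why scheduling is treated as an economic lever rather than an afterthought.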

It’s the Economics

    Wheelwright subscribes to the “economics theory” of bottlenecks, which particularly affects legacy processes with fixed resources. “You can put in a large enough column to run a batch in a single cycle, or you can use a smaller column and multiple cycles.” Downstream processing, under this model, becomes a trade-off between the materials and facility requirements of larger equipment and the time required for multiple cycles.
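Wheelwright's trade-off can be sketched with a minimal cost model. Every number below is an assumption for illustration (the article quotes no prices for columns or suite time), and the model deliberately ignores resin reuse; it only shows how resin outlay and facility time pull in opposite directions as column volume shrinks and cycle count grows.

```python
# Sketch of the column-size vs. cycle-count trade-off.
# All prices, volumes, and durations are hypothetical assumptions.
RESIN_COST_PER_L = 12_000    # $/L of packed resin (assumed)
HOURS_PER_CYCLE = 6          # load/wash/elute/regenerate (assumed)
SUITE_COST_PER_HOUR = 1_500  # facility plus labor burden (assumed)

def batch_cost(column_volume_l, cycles):
    """Resin outlay plus time-based facility cost for processing one batch."""
    resin = column_volume_l * RESIN_COST_PER_L
    facility = cycles * HOURS_PER_CYCLE * SUITE_COST_PER_HOUR
    return resin + facility

# Same harvest processed either way: one large column in a single cycle...
big = batch_cost(column_volume_l=100, cycles=1)    # 1,209,000
# ...or a column roughly a third the size run over three cycles.
small = batch_cost(column_volume_l=34, cycles=3)   # 435,000
print(big, small)
```

With these invented numbers the smaller column wins, but raise the suite cost or the number of cycles far enough and the balance tips back, which is exactly the scheduling-versus-capital tension the quote describes.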

    “The issues are most critical for CMOs operating older plants,” he says. “Due to upstream productivity they can now bid on a job that requires a smaller bioreactor, but they have problems processing all that protein.”

    Most new facilities, he notes, use process simulation to understand those trade-offs before the plant is constructed. Regardless, there is no easy answer to the apparent lack of economy for clinical batches. Processes optimized for commercial manufacturing attempt to squeeze every ounce of value from membranes and resins. At clinical scale, where time is critical, most firms never realize the full potential of expensive media, whereas during production the cost of high-priced materials and equipment is amortized over many batches.
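The amortization point reduces to simple arithmetic. The resin price and validated lifetime below are assumed round numbers, not data from the article; the sketch only shows why a short clinical campaign carries a far higher media cost per cycle than a commercial process that uses the column to the end of its qualified life.

```python
# Back-of-the-envelope resin amortization.
# Both figures are illustrative assumptions.
RESIN_COST = 500_000       # $ for a packed affinity column (assumed)
VALIDATED_LIFETIME = 100   # cycles the resin is qualified for (assumed)

def cost_per_cycle(cycles_actually_run):
    """Resin cost borne by each cycle when the column is retired after
    cycles_actually_run cycles, regardless of remaining qualified life."""
    return RESIN_COST / cycles_actually_run

clinical = cost_per_cycle(5)                   # short campaign: 100,000 per cycle
commercial = cost_per_cycle(VALIDATED_LIFETIME)  # full life: 5,000 per cycle
print(clinical, commercial)
```

A twenty-fold gap in cost per cycle, on these assumed figures, is why clinical-scale purification looks so uneconomical even when the process itself is sound.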

    Downstream bottlenecks arise, at their root, from the cost of purification media, notes Sarfaraz K. Niazi, Ph.D., chairman of Therapeutic Proteins International. “Protein A costs $17,000 per kilo, and a large process may use hundreds of kilograms.”

    His company, which manufactures monoclonal antibodies and cytokines in bacteria and animal cells, has simplified protein purification by conducting expression and purification in a single container. Therapeutic Proteins was awarded a patent in early 2012 for the fully contained, 100% disposable process vessel, and will soon introduce a 5,000 L model.

    The trick is harvesting protein instead of cells or supernatant. Protein A added to the fermenter adsorbs product while the cells and broth are skimmed off. Product is subsequently eluted off the resin and subjected to further purification steps. This invention is part of the U.S. Department of Defense’s “medicine on demand” initiative. “We can save about 50 hours of process time per batch,” Dr. Niazi says, “while improving yield and lowering costs.”

    According to Anthony Newcombe, Ph.D., who heads process and product validation for GlaxoSmithKline Biologics, quality by design (QbD) applied early in development can help bioprocessors avoid bottlenecks. QbD traditionally aims at improving process knowledge. When adopted early enough, it can help to broaden the design space so that problems like higher than expected titers may be addressed without panic.

    “QbD allows you to assess process risk early, to evaluate key quality attributes and critical process parameters earlier in process design, and to help understand process robustness,” he says. Understanding these factors provides “wiggle room” for facing the unexpected confidently, for example by building in extra capacity. It also works from the opposite direction by providing understanding of less-than-robust operations.

    QbD has been around for several years, but smaller companies remain skeptical about its potential rewards, says Dr. Newcombe. “It’s not fully embraced industry-wide, and it will take a few more years before regulators and industry align on QbD’s requirements.”

    Bioprocess bottlenecks, while hardly product-killers, are a genuine concern for established processes. Biotech companies and their suppliers have done a tremendous job of meeting these upstream/downstream capacity mismatches and will continue doing so for as long as upstream productivity keeps climbing.

    “A fair amount of work still needs to be done on the downstream side,” says Dr. Wheelwright, “but I’m confident that companies and academic groups involved in research and technology development will come up with good solutions to resolve bottlenecks and reduce the cost of downstream processing. I’m looking forward to the next five years.”
