The biomanufacturing industry as we know it today emerged in the 1980s following the unprecedented early success of recombinant DNA technology.
The approach was launched onto an unsuspecting and unprepared market in 1976 by the founders of Genentech, who pioneered the expression of somatostatin in bacteria (the first human protein produced in microbes) and went on to launch three key biopharmaceutical products: insulin (Humulin, licensed to Eli Lilly), factor VIII (licensed to Cutter Biological), and growth hormone (Protropin, launched by Genentech itself).
In those early days, the market for biotech-derived drugs was small, and the companies at the cutting edge focused on gaining product approvals, often after protracted negotiations with regulatory authorities.
The technologies used for downstream processing were the standards of the day. Their function was to extract sufficient amounts of active pharmaceutical ingredient (API) from each batch of fermentation broth, but they were not considered particularly important in their own right. Often based on laboratory methods, they were certainly not designed for efficiency or scalability; today's process-scale manufacturing plants were a distant dream.
Even so, the basic building blocks of today’s processes were put in place—cross-flow filtration and centrifugation as the reliable standards for clarification, chromatography as the principal method to separate the API from the large number of contaminating host-cell proteins and other molecules, and, finally, dead-end filtration to sterilize the product for filling and finishing.
Looking back, it seems remarkable how little attention was paid to process efficiency. Most products were required in such small amounts that more than 50% of any batch could be wasted with no economic impact. If demand increased, there seemed to be no ceiling on the productivity of bacteria and mammalian cells.
Indeed, titers have increased at least 1,000-fold over the last 30 years through a combination of strain engineering, medium optimization, and upstream process control. Only in the last few years has the industry acknowledged the stress this improved productivity has placed on downstream processing, but this is only one factor in a perfect storm of circumstances that is driving industry change.
In addition to increased demand and higher productivity, there is also more scrutiny from regulators. Perhaps the most important development affecting bioprocess operations is the concept of quality by design (QbD), i.e., the integration of quality control into the manufacturing process to ensure the quality of the product.
QbD was implemented to address waste in biomanufacturing, which, in some cases, saw more than 50% of product batches discarded because they failed to meet post-production specifications. QbD builds quality into the process itself and subjects it to constant in-line testing rather than testing only at the manufacturing end-point. This has had a profound impact on the technologies used for downstream processing.
Today’s biomanufacturing industry is teetering at the limits of its capabilities. In contrast to the laissez-faire approach in the first 15 years, there is now intense economic and regulatory pressure to wring as much product from each batch as possible without compromising quality. This means that downstream process technology is being tested to its limits, both physically and economically.
The most pressing concern is that, because the same technologies have been used since the birth of the industry, we are in some cases reaching the operational limits of each platform.
Chromatography is the most frequently cited example: the dynamic binding capacity (DBC) of a chromatography column can only be increased by scaling up the process, but doing so while maintaining the same process parameters, such as the linear flow rate, is impossible. Therefore, either new process parameters must be implemented or the entire process must be redesigned from the ground up.
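The scale-up constraint can be illustrated with a back-of-the-envelope calculation. The sketch below uses hypothetical numbers (bed height, linear velocity, and bed volumes are illustrative assumptions, not values from the text) to show why holding the linear flow rate constant forces the column diameter, and the implied volumetric flow, to grow with the required bed volume:

```python
# Illustrative column scale-up arithmetic (all numbers hypothetical).
# At a fixed bed height, bed volume can only grow via cross-sectional
# area, so the diameter grows with the square root of the volume; at a
# fixed linear velocity, volumetric flow grows in direct proportion.

import math

def required_diameter_cm(target_bed_volume_l, bed_height_cm):
    """Diameter needed to reach a target bed volume at a fixed bed height."""
    volume_cm3 = target_bed_volume_l * 1000.0
    area_cm2 = volume_cm3 / bed_height_cm          # V = A * h
    return 2.0 * math.sqrt(area_cm2 / math.pi)     # A = pi * (d/2)^2

def volumetric_flow_l_per_h(linear_velocity_cm_h, diameter_cm):
    """Volumetric flow implied by a fixed linear velocity: Q = u * A."""
    area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
    return linear_velocity_cm_h * area_cm2 / 1000.0

bed_height_cm = 20.0     # fixed bed height (assumption)
linear_velocity = 300.0  # cm/h, held constant across scales (assumption)

for bed_volume_l in (10, 100, 1000):
    d = required_diameter_cm(bed_volume_l, bed_height_cm)
    q = volumetric_flow_l_per_h(linear_velocity, d)
    print(f"{bed_volume_l:5d} L bed -> diameter {d:6.1f} cm, flow {q:8.0f} L/h")
```

Under these assumptions, a 100-fold increase in bed volume pushes the diameter from roughly 25 cm to over 2.5 m, which is well beyond what standard column hardware and packing methods support, hence the need to change parameters or redesign the process.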
The limitations of column chromatography have been exposed by increasing upstream titers because bind-and-elute operations are mass-driven rather than volume-driven. The amount of buffer can be varied as required, but the total mass of API produced in each fermentation batch now often approaches the DBC of the columns, risking product breakthrough.
These risks are lower, but still evident, in polishing chromatography, where larger fermentor offloads demand high-performance removal of excess host-cell proteins, metabolites, and nucleic acids.
Manufacturers therefore face the prospect of splitting batches over multiple processing runs or adding extra columns to handle the increased titer, which significantly increases production costs in terms of equipment purchase and facility size, while also affecting batch definition and regulatory approval.
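The mass-driven loading problem can be sketched in the same back-of-the-envelope style. All numbers below are hypothetical assumptions (batch volume, column volume, DBC, and the safety factor are illustrative, not values from the text); the point is only that as titer rises at a fixed batch volume, the number of capture cycles, or parallel columns, needed per batch grows in proportion:

```python
# Hypothetical loading calculation for a bind-and-elute capture step.
# Loading is typically kept below the dynamic binding capacity (DBC)
# by a safety factor to avoid product breakthrough.

import math

def capture_cycles(batch_volume_l, titer_g_per_l, column_volume_l,
                   dbc_g_per_l, safety_factor=0.8):
    """Cycles needed to capture one batch on a single column."""
    batch_mass_g = batch_volume_l * titer_g_per_l
    usable_capacity_g = column_volume_l * dbc_g_per_l * safety_factor
    return math.ceil(batch_mass_g / usable_capacity_g)

batch_volume_l = 2000.0   # fixed fermentor volume (assumption)
column_volume_l = 100.0   # fixed capture column (assumption)
dbc = 40.0                # g product per L resin (assumption)

for titer in (1.0, 5.0, 10.0):
    n = capture_cycles(batch_volume_l, titer, column_volume_l, dbc)
    print(f"titer {titer:4.1f} g/L -> {n} capture cycle(s) per batch")
```

Under these assumptions, a batch that once fit in a single cycle at 1 g/L needs seven cycles at 10 g/L, which is exactly the choice between splitting batches over multiple runs or investing in additional columns described above.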
The limitations of current downstream processing technology and the fact that manufacturers cannot simply expand out of trouble without increasing their costs in a linear fashion have forced manufacturers and their equipment suppliers to innovate. Some of these innovations have involved incremental improvements in process efficiency, e.g., by developing better resins and buffers to maximize the selective retention of target proteins while efficiently eliminating contaminants.
Some manufacturers have replaced expensive chromatography-based separations with simpler and less expensive solutions, such as precipitation steps that can be used to remove bulk contaminants or even to reversibly precipitate and recover the product.
New technological approaches also abound. Prominent examples include continuous chromatography platforms such as simulated moving bed chromatography to increase selectivity and resolution; the development of novel capture ligands and new chromatography formats that allow product capture under unfavorable conditions (e.g., salt-tolerant interaction chromatography); and the use of membrane adsorbers in place of fixed columns to enhance convective transport during chromatographic separations.