Before the advent of biologics, commercializing a therapeutic relied on successfully scaling up to meet market demand. This scaling-up strategy expanded biomanufacturing capacity by increasing the volume of the manufacturing train. But with the adoption of single-use technologies, biomanufacturers gained another way to expand manufacturing capacity: “scaling out,” that is, adding smaller, parallel manufacturing lines. Scaling out is especially pertinent to advanced therapeutics tailored to smaller patient populations or rare diseases. This flexible approach reduces initial sunk costs, and it can also be used strategically with broader-based therapeutics to respond to changes in market demand more quickly and cost effectively.
Prior to process design for either option, representative scale-down models allow for improved process development and optimization. These models can improve overall process understanding and reveal sources of variability before larger-scale investment. Accordingly, their importance cannot be overstated. Furthermore, in today’s world of artificial intelligence (AI)-driven digitalization and predictive modeling, data infrastructure plans should be developed in concert with product-scale strategies.
Which option to choose?
“The decision to scale up or scale out to increase manufacturing capacity ultimately depends on the long-term business strategy of the manufacturer and the anticipated market demand for the product,” said Joseph Makowiecki, enterprise solutions director of product development, management, and technical support, Cytiva.
An established strategy, scaling up amplifies product output by increasing the volume of the manufacturing train, that is, by enlarging the bioreactors and downstream systems. On the downside, it typically involves large initial capital expenditures, offers less flexibility to adapt to changes in product demand, and introduces process changes that can lengthen development and technical transfer timelines. According to Makowiecki, scaling up requires that the end user “identify, address, and overcome the inherent process and technical risks of scale up to demonstrate product comparability between scales.”
Scaling out uses replication, that is, adding more manufacturing lines in parallel, to increase product quantity. A relatively new industry strategy enabled by the adoption of single-use technologies, it increases manufacturing flexibility, allowing adaptation to changes in product demand while lowering process risk.
“Speed of deployment is possible,” Makowiecki remarked. “The manufacturing train is simply replicated. Scaling-up process or technical challenges are avoided. And the option exists to locate manufacturing closer to the patient population or in a more favorable labor market region. But economy-of-scale benefits are forfeited.”
Scaling out can also affect operational complexity. Additional facility footprints may be required to operate and house multiple manufacturing lines, suites, or sites. The utilization of multiple trains and their inherent variability may also require more operators and quality control resources. Automation can help to reduce the labor burden and human error, while also expediting quality control and product release.
In any scaling strategy, backtracking is costly. A representative scale-down model allows for cost-effective process development and optimization. Scale-down models also increase the overall effectiveness and speed of technical transfer, making them just as important as the choice between scaling up and scaling out. “The good news is that developers and contract development and manufacturing organizations have options,” Makowiecki emphasized. “Understanding each strategy’s pros and cons is critical to select the one that makes the most sense for your situation.”
No one-size-fits-all solution
Biologics production has no one-size-fits-all solution. Each process is unique, with its own set of challenges and requirements. Holistically, a scale-up strategy needs to consider the entire production workflow, from development site to production site, from upstream to downstream processes, and how they interact with each other.
“The upstream scale-up strategy should maximize drug yield and quality while minimizing manufacturing costs and ensuring regulatory compliance,” said Jianfa Ou, PhD, principal scientist, Bristol Myers Squibb (BMS). “Understanding the specific needs and behavior of the cell line is crucial, including growth rate, nutrient requirements, and shear sensitivity.”
According to Ou, it is challenging and impractical to match every aspect of small-scale and large-scale bioreactors. Scale up can lead to uneven shear forces and less efficient mixing. “Optimizing critical parameters like agitation speed and gas flow rate provides uniform nutrient distribution, adequate O2 transfer, and effective CO2 removal,” Ou explained. The scale-up process must also fit within the existing infrastructure and follow the process controls established at small scale to replicate performance.
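Agitation settings like those Ou describes are often transferred between scales using simple engineering criteria. As a minimal illustration, one common textbook rule, holding power input per unit volume constant under geometric similarity, can be sketched as follows. The impeller diameters and speed here are hypothetical, and this is not necessarily the criterion BMS applies:

```python
def scale_agitation_constant_pv(n1_rpm: float, d1_m: float, d2_m: float) -> float:
    """Agitation speed at the larger scale that holds power per unit
    volume (P/V) constant, assuming geometric similarity and fully
    turbulent flow (constant power number Np).

    P ~ Np * rho * N**3 * D**5 and V ~ D**3, so P/V ~ N**3 * D**2.
    Setting (P/V)_1 = (P/V)_2 gives N2 = N1 * (D1 / D2) ** (2 / 3).
    """
    return n1_rpm * (d1_m / d2_m) ** (2.0 / 3.0)

# Hypothetical example: a 0.2 m impeller at 200 rpm scaled to a 1.0 m impeller.
n2 = scale_agitation_constant_pv(200.0, 0.2, 1.0)
print(f"Large-scale agitation speed: {n2:.1f} rpm")  # ~68.4 rpm
```

Other criteria (constant tip speed, constant kLa) give different large-scale speeds, which is one reason no single parameter can be matched across all aspects of the two scales.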
Computational fluid dynamics modeling can be used to ensure uniform distribution of nutrients, gases, and cells while allowing for shear force assessment. “You want to predict performance and mitigate risk before executing expensive large-scale production,” Ou noted. Integrating computational and biological models that account for cell behavior and metabolic pathways could further improve predictive accuracy.
BMS scientists have also applied both O2 demand and CO2 stripping models as predictive tools to design the agitation and gas flow rates to meet cellular O2 demand and control CO2 levels effectively. This unique model-based, scale-up approach was successfully applied to bioreactors cultivating CHO cells at 15,000 L and 25,000 L scales.
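The oxygen side of such a model reduces to a basic mass balance: the oxygen transfer rate (OTR) must meet the cellular oxygen uptake rate (OUR). The sketch below uses generic, hypothetical values for cell density, specific uptake rate, and oxygen solubility; it illustrates the balance only and is not BMS’s actual model:

```python
def required_kla(cell_density_per_l: float, q_o2: float,
                 c_star: float, c_min: float) -> float:
    """Minimum kLa (1/h) at which oxygen transfer matches uptake.

    OUR = q_o2 * X            (mmol O2 / L / h)
    OTR = kLa * (C* - C)      (mmol O2 / L / h)
    At steady state OTR >= OUR, so kLa >= OUR / (C* - C_min).
    """
    our = q_o2 * cell_density_per_l
    return our / (c_star - c_min)

# Hypothetical CHO culture: 20e6 cells/mL with qO2 = 2.5e-10 mmol/cell/h,
# O2 saturation C* = 0.2 mmol/L, and a dissolved O2 floor of 0.06 mmol/L.
kla = required_kla(20e6 * 1000, 2.5e-10, 0.2, 0.06)
print(f"Required kLa: {kla:.1f} per hour")  # ~35.7 per hour
```

Agitation and gas flow rates can then be chosen so the vessel delivers at least this kLa, with an analogous balance governing CO2 stripping.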
Another example Ou related involved HEK 293 cells with a transfection step to produce a viral product for gene therapy. When transitioning from shake flasks to small-scale bioreactors, BMS scientists observed a 40–50% reduction in product yield. Combining computational fluid dynamics modeling with product profiles identified shear stress as the probable root cause, which was mitigated by adjusting the agitation speed.
Flexible scale out
According to Ruixiang Zhang, PhD, associate director of manufacturing control, WuXi Biologics, the primary consideration in the scale-out or scale-up decision is whether scale-up risks can be mitigated. Some parameters do not scale linearly, so consistent product quality and process performance can be challenging to ensure during scale up. Moreover, as culture volumes increase, so does the difficulty of controlling microbial contamination.
Changes to product demand require flexible and fast manufacturing adaptation. If demand increases, bioreactor volume must be increased and/or additional runs added. If demand decreases, then scale-up manufacturers may be in an overproduction scenario or required to implement a costly scale down. Scale out offers the flexibility to accommodate changing demand dynamics.
“Leveraging scale-out technology, we manufactured more than 3,000 kg of COVID-neutralizing antibodies, and hundreds of millions of doses of COVID vaccine in 2022,” Zhang said. “As of 2024, more than 10 scale-out projects that ranged from two to six bioreactors had successfully completed process performance qualification campaigns and been readied for FDA, EMA, or NMPA filings.”
Scale out uses single-use bioreactor technology and reduces scale-up risks in product quality and process performance. This more flexible approach accommodates a wide range of product levels and market demands and minimizes process comparability risk by deploying multiple identical bioreactors. Consistent process conditions across all bioreactors simplify validation and qualification.
The major concern lies in increased operational complexity, particularly at the N production stage and the harvest stage. The initial state of each bioreactor may differ slightly depending on the inoculation method at the N-1 to N stages, and the final harvesting state in each bioreactor may also differ slightly because harvesting is typically done sequentially. “However, in our experience, the differences are very small and handled competently through subtle process design and operator training,” Zhang offered. “Product and process performance quality remains very consistent bioreactor to bioreactor.”
Contract manufacturing organizations must meet many different manufacturing demands. For example, WuXi Biologics has three scale-out facilities in China and Ireland. Each plant has different capacities, but every plant utilizes the same bioreactor size. Downstream purification capacity is appropriately scaled to match different upstream volumes.
Problem solving with scale-down models
BEAM-201 is a quadruplex-edited allogeneic CAR T-cell investigational therapy for the treatment of relapsed/refractory T-cell acute lymphoblastic leukemia/T-cell lymphoblastic lymphoma (T-ALL/T-LL). The large-scale manufacturing process uses leukapheresis material from healthy donors containing 2–3 × 10⁹ cells and yields over 1.5 × 10¹⁰ edited CAR T cells.
But cells from healthy donors are not the same. “I was tasked to develop a scale-down model to screen and select healthy donor starting material for the allogeneic CAR T-cell production,” said Tania Emi, PhD, senior scientist, Cell Process Development, Beam Therapeutics.
The scale-down model needed to perform comparably to the large-scale process across all donors. Comparable process outcomes included cell population doubling, purity, viability, and editing efficiency. The BEAM-201 process consists of eight steps starting with cell isolation through final drug product preparation. An ideal scale-down process is scalable across all steps.
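One of those outcomes, cell population doubling, can be estimated directly from the expansion figures quoted for the large-scale process. The sketch below uses the generic doubling formula with the midpoint of the reported starting-cell range; it is an illustrative back-of-the-envelope check, not Beam’s method:

```python
import math

def population_doublings(initial_cells: float, final_cells: float) -> float:
    """Cumulative population doublings: log2 of the fold expansion."""
    return math.log2(final_cells / initial_cells)

# Figures quoted for the large-scale process: ~2.5e9 starting cells
# (midpoint of 2-3 x 10^9) expanding to ~1.5e10 edited CAR T cells.
doublings = population_doublings(2.5e9, 1.5e10)
print(f"{doublings:.2f} population doublings")  # ~2.58
```

A representative scale-down model should reproduce this doubling level, along with purity, viability, and editing efficiency, across donors.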
Scale-down models are useful for testing different parameters and conditions in a cost-efficient and timely way. “They also help to identify the critical quality attributes and critical process parameters that can influence the final product’s quality and efficacy,” Emi explained. “The GMP sentinel vials from dedicated healthy donor pools had insufficient starting material to conduct large-scale experiments, hence the need for a scale-down model for donor screening and process characterization studies.”
The first challenge was to establish scalable unit operations without altering drug product quality and efficacy. Cell isolation was identified as the critical step causing variability between the scale-down model and the large-scale process. “This was a difficult problem,” Emi recalled. “The variabilities depended on the donors and were not detected until the cells reached the final stage of expansion.” Thoughtfully designed experiments focused on several elements—the magnetic beads, columns, and reagents—and were executed across different donors.
“Successful and timely completion of the project was due to great teamwork,” Emi declared. “Challenges are inevitable in biotech research, and perseverance is necessary to overcome them.”
Data storage conundrum
With wider application of genomics and precision medicine and with greater implementation of digitalization and predictive modeling in biologics manufacturing, data generation continues to grow exponentially. “Seagate puts data first,” said Iman Anvari, director of product management, Seagate Technology, a manufacturer of mass-capacity hard drives. “Data are at the core of many applications. Data matter. Storage matters.”
According to Anvari, the first step in building an efficient, scalable data infrastructure is to plan for the data life cycle. Data are not static; they are generated, transferred, processed, utilized, and archived. Planning for the life cycle provides clarity and a wider view into necessary requirements. “Simultaneously, you have to address your cloud strategy,” Anvari advised, adding that it could be “hybrid or multicloud.”
A data-scaling strategy is tied to the specific stage of the data life cycle. During processing, both storage and performance are scaled out to feed the pipeline as fast as possible. When it is time to archive, by contrast, capacity outweighs performance, so storage is scaled up.
Today’s biggest data hurdle in the life sciences is the growth of data to exabyte levels. (One exabyte is equal to one million terabytes.) “The sheer size of data is the main blocker for some technologies to become a reality,” Anvari remarked. “Few research organizations plan for massive data lakes that also need to be within budget and sustainable. That is why early planning is so critical.”
A large data lake is typically composed of about 90% hard drives, with the remaining 10% solid-state drives for performance. As requirements grow, hard drives need to pack more storage into the same footprint to enable massive deployments.
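Using that rough 90/10 split, a first-pass capacity plan reduces to simple arithmetic. The drive capacities and the 100 PB target in the sketch below are hypothetical placeholders, not vendor figures:

```python
import math

def plan_data_lake(target_pb: float, hdd_tb: float = 30.0,
                   ssd_tb: float = 15.0, hdd_frac: float = 0.90):
    """Rough drive counts for a data lake that keeps ~90% of raw
    capacity on hard drives and ~10% on SSDs (illustrative sizes)."""
    target_tb = target_pb * 1000.0
    hdds = math.ceil(target_tb * hdd_frac / hdd_tb)
    ssds = math.ceil(target_tb * (1.0 - hdd_frac) / ssd_tb)
    return hdds, ssds

hdds, ssds = plan_data_lake(100)  # a hypothetical 100 PB data lake
print(hdds, ssds)  # 3000 667
```

Real deployments would also budget for redundancy, replacement cycles, and performance headroom, which is why Anvari stresses planning early.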
Strong foundational technology is required to reach exabyte scales. One of the biggest breakthroughs in the hard drive industry, Seagate’s Mozaic 3+ technology, is the first hard drive platform on the market based on heat-assisted magnetic recording. It enables 30-plus-terabyte hard drives in the same form factor, tripling capacity while reducing power requirements per terabyte by 45%. Seagate plans to expand the capacity of Mozaic 3+ hard drives even further to meet future data needs.