Increasing scrutiny and an evolving understanding of the impact of capsid impurities in recombinant adeno-associated virus (rAAV)-based therapies have catalyzed efforts to eliminate them from processing. No single standard approach exists, upstream or downstream.
Writing in a recent paper, scientists from Bristol Myers Squibb (BMS) in Seattle and Lund University in Sweden show how mechanistic modeling for anion-exchange membrane chromatography can enhance rAAV process development, maximizing full capsid enrichment, accurately predicting recovery yield, and rapidly identifying optimal process conditions against selected purity targets within a broad design space.
They found that the optimal strategy involves two steps: an isocratic elution of empty capsids followed by an isocratic elution of full capsids. They also noted that:
- Purity and yield are affected by Step 1 pH, with 9.0 as optimal.
- Higher purity requirements decrease yield.
- Shorter elution times reduce yields but improve productivity, buffer consumption, and pool concentration.
- Load challenge affects selectivity.
- There is a strong correlation between the salt concentration and wash length in Step 1 and the pool yield and purity in Step 2. Therefore, dialing in Step 1 parameters is the most industrially relevant goal.
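The search the paper describes, probing a broad design space against selected purity targets, can be sketched in a few lines. The functions below are hypothetical stand-ins for a calibrated mechanistic model (the real model predicts yield and full-capsid purity from chromatography physics, not closed-form curves); they exist only to show the grid-search pattern and the purity-versus-yield trade-off noted above.

```python
import itertools

# Toy surrogate responses -- hypothetical, for illustration only.
# A real workflow would call a calibrated mechanistic chromatography model here.

def predict_purity(ph, salt_mM):
    """Hypothetical full-capsid purity: peaks near pH 9.0, rises with Step 1 salt."""
    return max(0.0, min(1.0, 0.9 - 0.3 * abs(ph - 9.0) + 0.001 * (salt_mM - 100)))

def predict_yield(ph, salt_mM):
    """Hypothetical recovery yield: more Step 1 salt also strips full capsids."""
    return max(0.0, min(1.0, 0.95 - 0.002 * (salt_mM - 100) - 0.1 * abs(ph - 9.0)))

def optimize(purity_target):
    """Grid-search the design space; keep the highest-yield point meeting purity."""
    best = None
    for ph, salt in itertools.product([8.5, 8.75, 9.0, 9.25], range(80, 161, 10)):
        p, y = predict_purity(ph, salt), predict_yield(ph, salt)
        if p >= purity_target and (best is None or y > best[2]):
            best = (ph, salt, y, p)
    return best

ph, salt, y, p = optimize(purity_target=0.85)
```

Even with these toy responses, the search reproduces two of the reported behaviors: the optimum sits at pH 9.0, and raising the purity target forces the optimizer onto lower-yield conditions.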
Optimizing process conditions
The modeling approach offers two major advantages. “The first is to rapidly identify processing conditions that obtain the desired empty-full AAV separation,” co-author John Moscariello, PhD, VP, cell therapy drug product process development at BMS, tells GEN. “Once the models are generated, it is relatively straightforward to probe a large design space to understand regions that properly balance purity (high ratio of full-to-empty rAAV) and yield. This is true both generally, and when refining the models to accommodate new information or questions.”
The second advantage occurs when these models are used to support process characterization required to establish a commercial process control strategy. “These models help determine the criticality of various processing parameters based on their effects on critical quality attributes within normal operating ranges,” he says. “They also can be used to establish acceptable ranges to enhance manufacturing flexibility.”
Mechanistic models enable researchers “to rapidly investigate vast parameter spaces, including those that are extremely difficult to study experimentally, such as ligand density,” notes Moscariello.
Conversely, the empirical models usually generated using a design-of-experiments (DoE) approach are designed for investigational areas with limited parameters. “Adding additional parameters typically means either significantly increasing experiments or significantly decreasing the knowledge gained, such as the ability to understand interactive effects,” he adds.
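The arithmetic behind that scaling point is simple to make concrete. In a full-factorial design, each added factor multiplies the run count by the number of levels, so wet-lab workload grows exponentially, whereas a calibrated mechanistic model can sweep the same grid in silico. The snippet below is an illustrative calculation, not anything from the paper.

```python
def full_factorial_runs(n_factors, levels=3):
    """Runs needed for a full-factorial DoE with `levels` settings per factor."""
    return levels ** n_factors

# Each added factor triples the experimental burden at 3 levels:
runs = {k: full_factorial_runs(k) for k in (3, 4, 5, 6)}
# 3 factors -> 27 runs; 6 factors -> 729 runs
```

This is why practitioners either cap the number of parameters studied or fall back to fractional designs that sacrifice information about interactive effects.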
Before applying mechanistic modeling, “First, take time to understand the underlying assumptions within the model and ensure they apply to the specific system under investigation,” Moscariello and colleagues advise. “Second, remember that the quality of the modeling outputs is impacted by the quality of the data inputs. Lastly, while the de novo approach is highly effective, it’s not the only or de facto best option; many companies are beginning to offer off-the-shelf tools and technologies to support modeling efforts.”