However innovative or compelling a novel manufacturing process may be, its value depends on the ability to validate the process and to demonstrate its robustness and reproducibility over time. Process validation, both as a concept and in its implementation, is undergoing a sea change that is at once driving and being driven by an evolution in regulatory oversight.
At the heart of this change is a shift in emphasis from the product to the process. That shift is supported by the emergence of Process Analytical Technology (PAT) initiatives; by FDA guidance emphasizing Quality by Design (QbD) and Design of Experiments (DOE) strategies; and by analytical approaches that identify and validate critical process parameters and build quality control measures into a process by applying analytical and validation tools at the scale-down process design stage.
A shift in emphasis from product to process does not imply less rigorous product testing or less stringent QA measures at release. Rather, by developing a thorough understanding and analytical picture of the unit operations behind the synthesis, purification, and final preparatory steps that comprise a biopharmaceutical manufacturing process, by ensuring that operations stay within Proven Acceptable Ranges (PARs), and by maintaining QC throughout, manufacturers can expect the process to reliably yield an acceptable outcome and a high-quality product. Applying DOE and multivariate analysis to maximize the efficiency of individual process steps improves productivity and cost efficiency.
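As a toy illustration of the DOE idea (the factors, levels, and response here are invented for illustration, not taken from any real process), a two-level full-factorial design enumerates every combination of factor settings so that a simple multivariate analysis can estimate each factor's main effect on a response such as step yield:

```python
from itertools import product

# Hypothetical example: three process factors, each at a low/high level.
factors = {
    "pH": (6.5, 7.5),
    "load_g_per_L": (10, 30),
    "flow_cm_per_h": (100, 300),
}

# Full two-level factorial design: every combination of levels (2^3 = 8 runs).
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def main_effect(runs, responses, factor):
    """Average response at the factor's high level minus the average
    response at its low level, across all runs in the design."""
    low, high = factors[factor]
    n_half = len(runs) / 2
    high_avg = sum(r for run, r in zip(runs, responses) if run[factor] == high) / n_half
    low_avg = sum(r for run, r in zip(runs, responses) if run[factor] == low) / n_half
    return high_avg - low_avg
```

With measured yields for each of the eight runs, ranking the factors by the magnitude of `main_effect` indicates which parameters most influence the outcome and therefore deserve the tightest control.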
Furthermore, integrating risk analysis tools and Failure Mode and Effects Analysis (FMEA) methods to establish validation parameters may help identify key risk factors that can compromise the safety and quality of drug manufacturing.
These tools, when used to establish and implement a process validation protocol, serve more than a predictive role. They can also contribute to real-time decision making when a process parameter strays from established targets. That capability enables early failure of an errant batch or, perhaps, salvage of a valuable product and its raw materials: by demonstrating to regulatory authorities that stringent operational and QA/QC measures were in place, a manufacturer can show that, despite fluctuations in process variables, outcomes, product integrity, and quality attributes remain within acceptable limits.
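A minimal sketch of that decision logic (parameter names, targets, and limits below are invented for illustration): each monitored parameter is compared against its Proven Acceptable Range, and any excursion routes the batch to QA review rather than routine release:

```python
# Hypothetical PARs for a few monitored parameters, as
# (target, low limit, high limit); real values come from validation studies.
PARS = {
    "temperature_C": (20.0, 18.0, 25.0),
    "pH": (7.0, 6.8, 7.4),
    "conductivity_mS": (15.0, 12.0, 18.0),
}

def evaluate_batch(measurements):
    """Classify each measured parameter as within its PAR or an excursion."""
    status = {}
    for name, value in measurements.items():
        _target, low, high = PARS[name]
        status[name] = "in_range" if low <= value <= high else "excursion"
    return status

def batch_disposition(status):
    """Any excursion flags the batch for QA review instead of release."""
    return "qa_review" if "excursion" in status.values() else "routine_release"
```

The point of the sketch is the disposition step: an out-of-range reading does not automatically condemn the batch, but it does trigger the documented review that lets a manufacturer argue for salvage with data in hand.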
Cynthia Wooge, Ph.D., manager of process development at SAFC Pharma (www.safcglobal.com), defines the goal of QbD as identifying “what information is needed to demonstrate control of a process and to ensure a robust process.” In addition to a reproducible manufacturing process, the FDA now expects companies to provide data demonstrating that their processes are well understood.
“As we move increasingly toward various biologically derived materials, which have more inherent variability in the biological raw materials compared to those used in synthetic, small molecule APIs, whether from native tissue, transgenic sources, or cell-derived products, it becomes increasingly important to identify the factors that control the ability to make a product successfully,” says Dr. Wooge. Factors that impact product potency at various stages of the process and that may affect the removal of impurities, for example, require appropriate analysis and monitoring.
Consider a column purification step: critical information includes which contaminants are being removed and at what levels. These data help determine whether the operation is being carried out optimally by identifying its “sweet spot” and defining the range around that target within which the process can still achieve acceptable results. If, for example, the raw materials change, the validation data already generated can be used to assess whether the same column and process parameters can reproducibly achieve impurity removal within the established target range.
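One common way such impurity-removal data are summarized is as a log10 reduction value (LRV) for the step. A small sketch (the concentrations and the 3-log minimum below are hypothetical, and LRV is assumed as the chosen clearance metric) compares a measured reduction against a validated minimum:

```python
import math

def log_reduction(load_conc, eluate_conc):
    """log10 reduction of an impurity across a purification step."""
    return math.log10(load_conc / eluate_conc)

def within_validated_range(lrv, min_lrv):
    """Does the step still meet the validated minimum clearance?"""
    return lrv >= min_lrv

# Hypothetical: an impurity at 1e6 ng/mL in the column load and
# 1e2 ng/mL in the eluate gives 4-log clearance, which is checked
# against a validated minimum of 3 logs.
lrv = log_reduction(1e6, 1e2)          # 4.0
ok = within_validated_range(lrv, 3.0)  # True
```

Rerunning the same comparison after a raw-material change shows at a glance whether the unchanged column and process parameters still deliver clearance within the established target range.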