February 15, 2007 (Vol. 27, No. 4)

Applying DOE and Risk-Assessment Methods Early Eases Process Validation

However innovative or compelling a novel manufacturing process may be, its value depends on the ability to validate the process and demonstrate its robustness and reproducibility over time. Process validation—both the concept and its implementation—is undergoing a sea change that is at once driving, and being driven by, an evolution in regulatory oversight.

At the heart of this change is a shift in emphasis from the product to the process. This shift is supported by the emergence of Process Analytical Technology (PAT) initiatives, by FDA guidance emphasizing Quality by Design (QBD) and Design of Experiments (DOE) strategies, and by analytical approaches that identify and validate critical process parameters and build quality control measures into a process by applying analytical and validation tools at the scale-down process design stage.

A shift in emphasis from product to process does not imply less rigorous product testing or less stringent QA measures for release testing. Rather, by gaining a more thorough understanding and analytical picture of the unit operations behind the synthesis, purification, and final preparatory steps that comprise a biopharmaceutical manufacturing process, by ensuring operation within Proven Acceptable Ranges (PARs), and by maintaining QC throughout, the process should reliably yield an acceptable outcome and a high-quality product. By applying multivariate analysis to maximize the efficiency of individual process steps, DOE can also improve productivity and cost efficiency.

Furthermore, integrating risk analysis tools and Failure Mode and Effect Analysis (FMEA) methods to establish validation parameters may help identify key risk factors that can compromise the safety and quality of drug manufacturing.

These tools, when used to establish and implement a process validation protocol, serve more than a predictive role. They may also contribute to real-time decision making when a process parameter strays from established targets. This enables early failure of an errant batch or, perhaps, salvage of a valuable product and its raw materials: by demonstrating to regulatory authorities that stringent operational and QA/QC measures were in place, a company can show that, despite fluctuations in process variables, the outcomes, product integrity, and quality attributes remain within acceptable limits.

Cynthia Wooge, Ph.D., manager of process development at SAFC Pharma (www.safcglobal.com), defines the goal of QBD as identifying “what information is needed to demonstrate control of a process and to ensure a robust process.” In addition to a reproducible manufacturing process, the FDA now expects companies to provide data demonstrating that their processes are well understood.

“As we move increasingly toward various biologically derived materials, which have more inherent variability in the biological raw materials compared to those used in synthetic, small molecule APIs, whether from native tissue, transgenic sources, or cell-derived products, it becomes increasingly important to identify the factors that control the ability to make a product successfully,” says Dr. Wooge. Factors that impact product potency at various stages of the process and that may affect the removal of impurities, for example, require appropriate analysis and monitoring.

Consider a column purification step, for instance: critical information includes which contaminants are being removed and at what levels. These data help determine whether the operation is being carried out optimally by identifying its “sweet spot” and defining the absolute range around that target that the process can tolerate while still achieving acceptable results. If, for example, the raw materials change, the validation data already generated can be used to assess whether the same column and process parameters can reproducibly achieve impurity removal within the established target range.

Multivariable Analysis

The good news is that the bench scientist is thinking about the long-term consequences of process development in terms of the information needed to support regulatory demands. This has necessitated a greater emphasis on analytical methods, including methods that can be applied to more complex sample mixtures. It has led to a greater demand for real-time inline monitoring and analytical techniques, inline sensing of multiple measures at one time, and enhanced automation to enable more rapid responsiveness.

According to Rajiv Nayar, Ph.D., president of HTD Biosystems (www.htdcorp.com), a key advantage of DOE lies in the ability to glean critical information about the interactions between process variables. Whereas, traditionally, companies would optimize one process variable at a time, “we are using multivariable experimental design to look at many variables in the same experiment,” says Dr. Nayar.

When lot-to-lot variation occurs during scale-up, for example, pinpointing the cause requires identification and analysis of not only critical process variables but also the potential effects of their interactions on the quality of the end product. This approach not only provides a robust process but also validates the importance of critical process parameters.

“A limitation of using DOE is that you must be able to scale down the process to identify the key variables and look at these interactions,” Dr. Nayar says.

“Experimental design is based on doing the minimum number of experiments, or trials, to get the maximum amount of data,” he adds. “It is experiential; you must believe in it,” and it requires education and experience to do so properly. The benefits of DOE carry over from process optimization and validation to QA/QC, enabling more rapid and well-defined assessment.
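The multivariable approach Dr. Nayar describes can be illustrated with a minimal two-level full factorial design. The sketch below is an assumption-laden toy, not any company's actual method: the three factor names, the coded levels, and the synthetic yield model (which deliberately includes a pH-by-temperature interaction) are all illustrative. It shows how a single designed experiment estimates both main effects and interactions, which one-variable-at-a-time optimization would miss.

```python
from itertools import product

# Hypothetical two-level (coded -1/+1) full factorial for three
# process variables; the names are illustrative only.
factors = ["pH", "temp", "load"]
design = list(product([-1, 1], repeat=len(factors)))  # 2^3 = 8 runs

def response(ph, temp, load):
    # Toy yield model with a built-in pH x temp interaction
    # (assumed for illustration, not data from the article).
    return 80 + 3*ph - 2*temp + 1.5*ph*temp + 0.5*load

runs = [(levels, response(*levels)) for levels in design]

def effect(index):
    # Main effect: mean response at the +1 level minus mean at -1.
    hi = [y for lv, y in runs if lv[index] == 1]
    lo = [y for lv, y in runs if lv[index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

def interaction(i, j):
    # Two-factor interaction, estimated from the product of coded levels.
    hi = [y for lv, y in runs if lv[i] * lv[j] == 1]
    lo = [y for lv, y in runs if lv[i] * lv[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for k, name in enumerate(factors):
    print(f"main effect of {name}: {effect(k):+.2f}")
print(f"pH x temp interaction: {interaction(0, 1):+.2f}")
```

Because every factor combination is balanced against every other, eight runs cleanly separate all three main effects and the interaction, which is the "minimum experiments, maximum data" property the quote refers to.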

Defining Design Space

Focusing on the steps leading up to DOE and defining the design space, Amit Banerjee, Ph.D., research fellow at the global biologics worldwide pharmaceutical sciences division at Pfizer (www.pfizer.com), stresses the importance of risk analysis and risk-based process validation.

Risk assessment begins when a commercial manufacturing process has been developed and individual process steps and unit operations identified. Pfizer’s approach, called Right First Time, is based on understanding process performance by integrating PAT, DOE, and FMEA principles.

“You need to prioritize the experiments to be done at a small scale—to identify the key experiments that will give you the most bang for the buck,” says Dr. Banerjee.

The goal is to determine what parameters of each unit operation affect quality attributes and to determine how to measure and monitor these parameters and establish the proven acceptable ranges. Each quality attribute and process parameter is represented in a grid and weighted by applying a cause-and-effect matrix, which results in the assignment of a score for each combination. “The higher the score, the higher the priority to investigate the relationship between the quality attribute and process parameters,” explains Dr. Banerjee, and this priority ranking is then the basis for selecting which experiments to do.
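The cause-and-effect matrix Dr. Banerjee describes can be sketched as a simple weighted-sum scoring exercise. In the sketch below, the attribute weights, parameter names, and 0–9 impact ratings are all hypothetical placeholders, not Pfizer's actual values; the point is only the mechanics of turning a grid of ratings into a priority ranking for small-scale experiments.

```python
# Hypothetical cause-and-effect matrix: each quality attribute gets an
# importance weight, and each process parameter is rated 0-9 for its
# presumed impact on each attribute. All names and numbers are illustrative.
attribute_weights = {"aggregation": 10, "purity": 8, "yield": 5}

impact = {  # impact[parameter][attribute]
    "temperature": {"aggregation": 9, "purity": 3, "yield": 1},
    "load_conc":   {"aggregation": 6, "purity": 5, "yield": 7},
    "pH":          {"aggregation": 3, "purity": 9, "yield": 3},
}

# Score each parameter as the weight-times-impact sum across attributes;
# a higher score means a higher priority for small-scale DOE work.
scores = {
    param: sum(attribute_weights[attr] * rating for attr, rating in ratings.items())
    for param, ratings in impact.items()
}

for param, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{param}: {score}")
```

With these made-up numbers, load concentration outranks temperature and pH, so its relationship to the quality attributes would be investigated first.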

Using aggregation as an example of a quality attribute and size exclusion chromatography as the measurement tool, aggregation during operation of the column might be affected by changes in temperature, load concentration, and pH. A company would then use DOE to establish the PARs and identify the “sweet spot” for this unit operation. With this information in hand, it can then start exploring the design space and determine how to maximize quality, product yield, and ease of operation.

Understanding the design space should enable “a more flexible regulatory approach,” envisions Dr. Banerjee. In addition to process flexibility, it also allows for real-time quality control. “As you introduce more PATs, you may be able to minimize product release testing.” If, for example, the pH varies from the target value in a particular unit operation but stays within the “sweet spot”, knowledge of the design space could allow a company to present to the FDA data-based evidence demonstrating why it is not necessary to fail the batch despite the variation in pH.
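The batch-disposition logic implied here, that a drift from target need not fail a batch so long as the parameter stays inside its validated range, can be sketched as a trivial PAR check. The ranges, targets, and readings below are assumptions for illustration only.

```python
# Illustrative proven-acceptable-range (PAR) check: a parameter that
# drifts from its target but remains inside the validated range does
# not, by itself, warrant failing the batch. All values are assumed.
pars = {"pH": (6.8, 7.4), "temp_C": (18.0, 24.0)}
targets = {"pH": 7.1, "temp_C": 21.0}

def assess(readings):
    # Classify each in-process reading against its PAR.
    verdicts = {}
    for name, value in readings.items():
        lo, hi = pars[name]
        verdicts[name] = "within PAR" if lo <= value <= hi else "out of PAR"
    return verdicts

# A pH of 7.3 is off the 7.1 target but still inside the validated range.
print(assess({"pH": 7.3, "temp_C": 25.1}))
```

In practice the supporting argument to regulators would rest on the design-space data behind those ranges, not on the check itself.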

“We are hoping the FDA is moving in this direction,” says Dr. Banerjee. “PAT is a critical component of this risk-based approach.” The analytical tools and techniques needed for PAT must now advance rapidly “to enable real-time analysis of unit operations.”

Early Regulatory Interaction

Development of manufacturing processes for cell-based therapeutics represents “a big growth area, with more products moving from concept into early development and clinical trials,” says Stewart Craig, Ph.D., CTO and vp of Progenitor Cell Therapy (www.progenitorcelltherapy.com), a contract development and cGMP-compliant manufacturing services provider for the production of cell-based therapeutics.

Dr. Craig describes a cooperative regulatory environment in the U.S., with a similar situation developing in Europe as regulatory authorities there continue to formulate the regulations that will guide commercial development of cell-based therapies. “We encourage our clients to approach the FDA early,” he says, to discuss the concepts, lay the groundwork, and explain how the company intends to characterize its product and to develop a manufacturing process. Companies can benefit from the experience and insights the FDA has to offer.

Product characterization is one of the key considerations in developing cell-based products, due largely to the variability inherent in the starting material. The material is collected from an autologous patient or an allogeneic donor and can vary from individual to individual. The challenge lies in trying “to level the playing field to make the process for manufacturing reproducible despite the variation at the front end,” says Dr. Craig.

Process validation involves characterization of the starting material and the desired final product, as well as defining limits and ranges for various in-process criteria. Unlike small molecules or biological compounds, these products cannot be characterized down to the molecular level by sequence analysis or mass spec fingerprinting, for example. Instead, developing a product profile must rely more on biological and physical characterizations, such as the identification of combinations of factors, including surface markers, secreted proteins, cytoskeletal characteristics, and aspects of biological function.

QA/QC Validation

A key aspect of process validation and stability testing is defining product potency. “We recommend to our customers that they address this early in the product concept stage when they are defining preclinical needs and the overall development program,” Dr. Craig states. At this time, they should build in an early, broad product characterization profile and assess process criteria within the context of product potency and purity. The goal is “to make QC testing for product release and product stability as routine and validatable as possible,” he adds.

For biopharmaceuticals produced in cell lines of human or animal origin, “one important quality attribute is the virus safety of the product,” notes Hannelore Willkommen, Ph.D., vp of regulatory affairs at NewLab BioQuality (www.newlab.de).

In accordance with the Q5A guideline adopted by the International Conference on Harmonisation, companies must assess the safety of the source materials, the cell line, and any component used to establish the cell line and to cultivate the cells during production and must take steps to minimize and monitor the risk of introducing infectious agents into the process pipeline.

“There is high awareness in the industry that only all three components, the safety of the source materials, the capacity of the process to inactivate or remove infectious agents in the unlikely event that they gain access to the system, and finally, the control of the manufacturing process by direct testing of intermediates, such as the unprocessed bulk material, and assurance that production conforms with cGMP, can guarantee the continuous production of a safe product,” says Dr. Willkommen.

The manufacturing process must be able to remove cell-derived impurities and to inactivate or remove viruses, as demonstrated by virus validation studies performed under conditions that mimic production-scale processes. Dr. Willkommen describes a “trend to more standardized processes for the manufacture of specific products, like monoclonal antibodies, where at least two orthogonal, robust virus-removal steps are involved in the manufacturing process. Dedicated steps for virus removal and/or virus inactivation are implemented.”

Furthermore, regulatory authorities are moving toward allowing the use of in-house data generated with one product to be used to support another, similar product if identical methods are used in manufacturing. “This approach is not accepted at the moment as the basis for virus safety assessment for marketing authorization, but it may be applicable to products in the early phase of development,” says Dr. Willkommen. Eventually, this could help reduce the amount of testing and virus validation studies needed to support a regulatory filing.

Another strategy would be to establish a database that could demonstrate the effect of changes in operational parameters on virus inactivation/removal by a specific unit operation. Although this would initially present “a high burden to perform the virus studies and to establish the data base, it may reduce the work later if manufacturing changes during optimization of the unit operation are considered or if a strategy for development of new products needs to be developed,” she adds.

Scale-down Strategies

Evaluation of virus removal and inactivation methods relies on studies performed on scaled-down unit operations. To demonstrate appropriate scale-down, it is essential to compare operational parameters and analytical data for intermediates both pre- and post-processing. “It may also be important to control the effect that the virus spike might have in changing the composition of the intermediate and, consequently, the performance of the process step. If such influence is observed, the volume of virus added to the intermediate should be reduced or a purified virus should be used for spiking,” says Dr. Willkommen.

FMEA is “designed for upfront process design” and can help identify which steps require extremely tight operational control limits and which can accommodate more latitude and still yield acceptable outcomes, explains Dr. Wooge. These risk analysis tools can also identify which steps in a process are scale-dependent and which are scale-independent. Because the goal is to validate manufacturing processes at benchtop scale, it is essential to show that the scale-down parameters identified during the design stage are proportional to manufacturing scale.
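A conventional FMEA ranks failure modes by a Risk Priority Number, the product of severity, occurrence, and detection ratings. The sketch below uses that standard RPN metric; the process steps and 1–10 ratings are invented for illustration and do not come from the article.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    severity: int    # 1-10: impact on product quality if it occurs
    occurrence: int  # 1-10: likelihood of the failure occurring
    detection: int   # 1-10: 10 = hardest to detect before release

    @property
    def rpn(self) -> int:
        # Risk Priority Number, the conventional FMEA ranking metric.
        return self.severity * self.occurrence * self.detection

# Illustrative failure modes; the steps and ratings are assumptions.
modes = [
    FailureMode("column load pH drift", severity=8, occurrence=4, detection=6),
    FailureMode("buffer conductivity excursion", severity=6, occurrence=3, detection=2),
    FailureMode("scale-dependent mixing shear", severity=7, occurrence=5, detection=7),
]

# Highest RPN first: these steps get the tightest control limits and
# the earliest small-scale characterization work.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.step}: RPN = {m.rpn}")
```

Note how a hard-to-detect, scale-dependent step can outrank a more severe but easily detected one, which is exactly the prioritization FMEA is meant to surface.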

Dr. Wooge notes the value of improved small-scale equipment for process validation that has come onto the market. These bench-scale models “are more proportional and representative of manufacturing scale, so you can trust the data,” she says. The vendors recognize that if they can capture customers at the design stage, then they are more likely to gain and retain their business throughout scale-up and manufacturing.
