Critical Control Attributes
“What needs to be in place are critical quality attributes (CQAs),” Stephens noted, “which are the physical, chemical, biological, or microbiological properties or characteristics in a process that should be within an appropriate limit, range, or distribution to ensure the desired product quality.” These must be related to product properties, and they are not usually simple measurements; examples of CQAs are the product’s identity, strength, potency, purity, density, particle size, and moisture content.
Critical process parameters (CPPs) affect CQAs and must be understood, monitored, and controlled to ensure that the process produces the desired quality in a consistent manner. This can be complex for many reasons, including the management of the collected data.
“Particle size and moisture can both be CQAs in a fluid bed dryer,” Stephens noted, “but the controller outputs to the process can conflict when each CQA is evaluated independently.” Because of this, data from the CQAs must be integrated and managed in a single controller database to provide a single controller output to the process, he emphasized. In fact, he claimed that in a full-blown PAT configuration, data management is often the most critical part of the system.
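The idea of a single controller output derived from both CQAs can be sketched as follows. This is a minimal illustration, not Stephens's actual control scheme; the setpoints, weights, and gain are entirely hypothetical.

```python
# Minimal sketch (hypothetical setpoints, weights, and gain): two CQAs measured
# in a fluid bed dryer are combined into one controller output, rather than
# letting each CQA drive the process independently and conflict.
def combined_output(particle_size, moisture,
                    size_setpoint=150.0, moisture_setpoint=2.0,
                    w_size=0.4, w_moisture=0.6, gain=0.05):
    """Return a single heater adjustment from both CQA deviations."""
    size_error = (particle_size - size_setpoint) / size_setpoint
    moisture_error = (moisture - moisture_setpoint) / moisture_setpoint
    # More heat dries the bed (moisture falls) but can shrink granules,
    # so the two error terms deliberately pull in opposite directions.
    return gain * (w_moisture * moisture_error - w_size * size_error)
```

Evaluating both deviations in one function is what prevents the conflict: excess moisture pushes the output up, oversized particles push it down, and the weights arbitrate between them.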
For example, measurement of CQAs often requires advanced analytical sensor techniques using multivariate calibration to convert the analytically measured data to a physical property. Sophisticated software is required to perform partial least squares, multiple linear regression, multivariate curve resolution, noise reduction, and other calculations. These data must be identified and their interactions defined to produce the optimal control strategy. What is not needed, Stephens stressed, is applying a large number of on-line analyzers to measure everything, collecting “tons and tons of data without regard to CQAs.”
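To make the multivariate calibration step concrete, the sketch below fits a multiple linear regression (one of the techniques named above) that maps multichannel sensor readings to a physical property. The five-channel readings, coefficients, and moisture values are simulated stand-ins, not data from any real analyzer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated calibration set: 30 batches, each with a 5-channel sensor reading.
# (Hypothetical data; a real system would use measured spectra.)
true_coefs = np.array([0.8, -0.3, 0.5, 0.1, -0.6])
X = rng.normal(size=(30, 5))
moisture = X @ true_coefs + 2.0 + rng.normal(scale=0.01, size=30)

# Multiple linear regression: fit an intercept plus one coefficient per channel.
X_design = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(X_design, moisture, rcond=None)

# Convert a new sensor reading into a predicted moisture value.
new_reading = np.array([1.0, 0.2, -0.1, 0.5, 0.3])
pred = coefs[0] + new_reading @ coefs[1:]
```

In practice, partial least squares is often preferred over plain regression when the sensor channels are highly correlated, which is typical of spectroscopic PAT instruments.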
“When you get data management implemented,” Stephens concluded, “you can use it to support QbD and RTPR and you can document the product’s design space that allows the use of a flexible regulatory approach as proposed by the FDA.” Citing the ICH document, “Guidance for Industry, Q8 Pharmaceutical Development,” Stephens described design space as “the multidimensional combination and interaction of input variables and process parameters that have been demonstrated to provide assurance of quality.” This is a different approach than earlier GMP guidance, which depended on documented evidence of a locked-down process for manufacturing quality drugs.
A recent article in the Journal of Pharmaceutical Innovation, “The Financial Returns on Investments in Process Analytical Technology and Lean Manufacturing: Benchmarks and Case Study,” concluded that a PAT system used with lean manufacturing principles can reduce product manufacturing cycle times from about 25 days to 12 days. Big pharma is critically aware that the people and equipment used in manufacturing drugs are expensive, Stephens noted, and with savings such as this, and the promise of further savings and improved quality from improved understanding of the manufacturing process, they are wading into the PAT pool.
Anna Persson from Umetrics expanded on the use of multivariate analysis (MVA) in her presentation, “Multivariate analysis based on design space.” She noted that the PAT guidance document provided by the FDA points out that MVA is a tool that can help ensure that manufacturing process data is utilized to increase process understanding. Pharmaceutical manufacturing data is typically multivariate, generated in large volumes, and comprises complex data sets.
“MVA is an analysis technique,” she noted, “that looks at all the data at one time. The design space is established by design of experiments and visualized by MVA.”
Given the noise and missing data that are common in manufacturing processes, MVA helps users build models for troubleshooting and for increasing process understanding. “We can use design to build quality into the process by defining the operating range of parameters that lead to product quality,” she added.
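A common MVA workflow of the kind Persson describes is principal component analysis with a Hotelling's T² control limit: a model is built from normal batches, and new batches that fall outside the limit are flagged for troubleshooting. The sketch below uses simulated process data (the variables, loadings, and the deliberately abnormal batch are all hypothetical).

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated process data: 50 normal batches, 8 correlated process variables.
latent = rng.normal(size=(50, 2))
loadings = rng.normal(size=(2, 8))
X = latent @ loadings + rng.normal(scale=0.1, size=(50, 8))

# Center the data and extract principal components via SVD, a basic MVA step
# that "looks at all the data at one time" rather than variable by variable.
mu = X.mean(axis=0)
Xc = X - mu
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T            # project batches onto the first two components

# Hotelling's T^2 statistic measures how far each batch sits from the model.
var = scores.var(axis=0, ddof=1)
t2 = (scores**2 / var).sum(axis=1)
limit = np.percentile(t2, 95)     # simple empirical control limit

# A deliberately abnormal batch scores far outside the limit.
bad = mu + 5 * np.sqrt(var[0]) * Vt[0]
t2_bad = (((bad - mu) @ Vt[:2].T) ** 2 / var).sum()
```

Because the components summarize correlated variables jointly, this approach tolerates measurement noise better than monitoring each of the eight variables with its own univariate chart.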
One goal is to build quality systems into what Persson referred to as the “golden batch trajectory.” Using this trajectory as a frame of reference enables QC operations to move from traditional validation to a process focus.
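The golden-batch idea can be sketched as a reference trajectory with a tolerance band built from historical good batches; a running batch is compared against the band in real time. The temperature profile, band width, and drift below are illustrative assumptions, not a description of any specific Umetrics tool.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical "golden batch" trajectory: the mean temperature profile of 20
# historical good batches, with a +/- 3-sigma tolerance band at each time point.
t = np.linspace(0, 10, 50)
golden = 20 + 6 * np.log1p(t)                       # nominal reference profile
history = golden + rng.normal(scale=0.3, size=(20, 50))
mean = history.mean(axis=0)
band = 3 * history.std(axis=0, ddof=1)

def deviations(batch):
    """Return time indices where a running batch leaves the tolerance band."""
    return np.flatnonzero(np.abs(batch - mean) > band)

good_batch = golden + rng.normal(scale=0.2, size=50)
bad_batch = golden + np.where(t > 5, 3.0, 0.0)      # drifts late in the run
```

Monitoring against the trajectory shifts QC from after-the-fact validation to a process focus: a deviation is visible while the batch is still running, not only at final release testing.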
“One challenge is that data is collected and stored in different places,” Persson noted. “We see variability that can affect quality in both the process itself, as well as in the raw materials. It is important to look at both sources to relate variability to final QC results. This places a high demand on infrastructure to maintain data integrity and combine it into a format that suits real-time monitoring and prediction.”