With bioprocessing moving from a niche activity to a standardized business platform, in-line process monitoring and control has become the holy grail for process analytics companies. Until this year there has been a lot of talk but little action. That is beginning to change as companies realize that process analytics offers a real strategic advantage. The action, so far, is mainly in academic labs, but a handful of companies are developing at-process monitoring and other techniques that offer noticeable advantages.
The interest in in-line, real-time processing is part of the need to become more efficient and thus lower costs as well as reduce infrastructure, according to Parrish Galliher, founder and CTO of Xcellerex. With off-line analytics, “you don’t find out how well you’ve produced your product until the end when you test it. Throwing out batches that failed during manufacturing happens often enough, so that there’s an incentive to address it.”
Hence the interest in at-line or in-line methods that can detect problems much earlier. Early detection helps companies either salvage batches with adjustments within established parameters or identify failed batches faster. This strategy saves time and money, decreases waste, increases throughput, and enhances flexibility, Galliher says.
At-Line Process Monitoring
“The industry is looking for tests that measure drug quality” by revealing the product’s structure, Galliher continues. That type of process analytical technology was first used for pharmaceuticals in the development phase. The few such tools for the manufacturing phase tend to be in the early stages of development.
The challenge is more than merely adapting existing technology to an in-line, real-time environment. In biological manufacturing there are very few sensors that tell you what’s going on with the molecule of interest, Galliher points out. Instead, “the industry relies upon parametric sensors—pH, UV, conductivity, etc.—that measure the environment in which you’re making the drug. That’s like walking down a path blindfolded.
“At Xcellerex, we’re actively engaged in online, real-time sensor development, looking at quality and quantity,” Galliher says. “We’re past proof-of-concept and are hardening the sensors in a development setting. The trick is analyzing drug quality and quantity early on, when the drug isn’t very pure.” This ensures the assay analyzes the drug and is not confounded by background contaminants, he explains.
To do this, Xcellerex is evaluating many software algorithms and examining molecules in different ways, and at multiple time points, to generate a more complete analysis than is possible with the current snapshot-in-time analytic approach.
Millipore is one of a few companies developing at-line process monitoring equipment approximating real-time speeds. The firm’s method is based upon nucleic acid technology developed in collaboration with Gen-Probe, explains Jean-Paul Mangeolle, president of the bioprocess division of Millipore. It is being designed to detect microbial contamination in raw materials, in-process samples, and final products.
The technology, called MilliPROBE™, uses Gen-Probe’s target capture methodologies and real-time transcription-mediated amplification to concentrate, amplify, and purify ribosomal RNA (rRNA) without culturing. Consequently, very low levels of contamination can be identified in samples. The first product, released last January, provides an early-warning system to detect Pseudomonas aeruginosa in purified water used in drug manufacturing processes.
MilliPROBE can be used within the quality control lab, although the goal is to optimize it for the manufacturing floor and ultimately deliver in-line process monitoring and control technology. The platform identifies contaminants within a few hours, offering a vast improvement over the three to five days required by traditional culture methods. Validation data showed a sensitivity of approximately 10,000 copies of P. aeruginosa rRNA and greater than 98% detection of actual microbial cells. The assay’s detection limit is below 10 cells per 100 mL of sample.
“When you try to do nucleic acid testing, you’re getting very low concentrations of contaminants,” Mangeolle points out. “Therefore, overcoming background contamination is a challenge. You must move to closed systems that offer no risk of outside contamination.
“To be fast, you have to eliminate the culture step. So 100 to 200 mL are filtered through a membrane, the contaminants are lysed using chemicals or other methods, rinsed, and assayed to measure the nucleic acid.”
Eventually, the MilliPROBE system is expected to replace an important segment of Millipore’s process-monitoring business as the industry moves from growth-based technology to more modern culture-free products, Mangeolle notes.
Celsis International is developing a quick speciation test that within 24 hours identifies the type of contamination problem a product is facing, according to Jennifer Havill, product manager. This nucleic acid-based assay is being designed to overcome “many of the problems of current RNA methods, such as tightly controlled temperature requirements and dangers of contamination.”
Currently, Celsis is working with consumer products and pharma manufacturers to help them convert from RapidScreen to AkuScreen. Both technologies are based on ATP bioluminescence. RapidScreen detects microbial contamination within 24 to 48 hours. AkuScreen is even faster, detecting even minute levels of mold within 18 to 24 hours.
Return on investment typically occurs within six to eight months, Havill reports. “Companies using AkuScreen can release their products to market six to fourteen hours faster than before and several days earlier than traditional methods. That translates into an average of $500,000 net present value.”
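The economics behind Havill’s claim amount to a time-value-of-money calculation: releasing a batch sooner pulls its cash flow forward, which raises its present value. The sketch below is a minimal illustration of that arithmetic; all figures (batch revenue, timing, discount rate) are hypothetical assumptions, not Celsis data.

```python
def pv(cash_flow, annual_rate, years):
    """Present value of a cash flow received `years` from now."""
    return cash_flow / (1 + annual_rate) ** years

def npv_gain_from_earlier_release(cash_flow, annual_rate, release_years, days_earlier):
    """Incremental NPV from receiving the same cash flow `days_earlier` sooner."""
    accelerated = release_years - days_earlier / 365.0
    return pv(cash_flow, annual_rate, accelerated) - pv(cash_flow, annual_rate, release_years)

# Hypothetical example: $2M of batch revenue normally realized 30 days
# after testing begins, pulled in 3 days earlier, at a 10% discount rate.
gain = npv_gain_from_earlier_release(2_000_000, 0.10, 30 / 365.0, 3)
# `gain` is positive: the same revenue, received sooner, is worth more today.
```

Summed over many batches per year, even a modest per-batch gain compounds into the kind of aggregate figure Havill cites.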
Sensor manufacturers credit the FDA’s Process Analytical Technology (PAT) initiative for fostering some advances. As biotech commercializes more products, the need to industrialize manufacturing increases, notes Jim Furey, general manager and founder of PendoTECH.
Disposable manufacturing is another approach to monitoring processes. PendoTECH, for example, developed “a disposable pressure sensor that is used in field tests and manufacturing to monitor filter pressure and, for bioreactors, to measure gas pressure in head space,” Furey says. That development extends the benefits of single-use manufacturing to monitoring technology. Unlike multi-use sensors, this single-use sensor is designed in two parts: a sensor and a monitor, plus a way to capture the data.
“The sensors aren’t just adaptations of traditional sensors,” Furey asserts. Instead, they use a materials engineering approach that integrates microelectromechanical systems (MEMS), fiber optics, membrane-based chemistry, and other techniques to enhance performance.
“This year, we’re seeing a dramatic growth of interest in this sensor driven by the pandemic vaccines and by the increase in the numbers of compounds entering clinical trials,” Furey reports. Also, the increase in cell therapy is creating smaller batches, which are amenable to single-use manufacturing.
PendoTECH’s single-use technology scales from 100 mL to thousands of liters, with fittings for tubing from 1/16 inch to 1 inch. The company is also developing disposable sensors for pH, UV absorption, and conductivity, which Furey predicts will launch within the next 12 months. A single-use chromatography system is also in the product mix.
“Single-use processes are being embraced up- and downstream,” stresses Juliette Schick, president of SciLog. “Single-use sensors,” she says, “are critical to the success of single-use platforms.”
The challenge is delivering sensors suited to the conditions of single-use applications and platforms. SciLog has a family of precalibrated, ready-to-use sensors that may reduce errors by eliminating the need for field calibration.
“They are ideal for use as components of single-use tubing sets, manifolds, and bundled solutions,” Schick says. Current models are designed to measure conductivity, pressure, and temperature. “At the end of the year, we’ll be adding two more sensors.”
HPLC for Bioreactor Monitoring
More feedback control of the bioreactor is needed to increase the probability of achieving a product’s critical quality attributes, says Rick Cooley, manager, Process Analytics Center of Excellence, Dionex. “Online HPLC can provide high fidelity information about conditions in the bioreactor, letting developers look at the product itself as well as critical feed components. Historically, most companies have utilized tools that only measure the cell’s environment.”
Cooley used online HPLC successfully to monitor peptide production operations. By using HPLC to determine when product of the correct purity was eluting from the purification columns, overall cycle time was reduced by several days, he notes. “The process analytics enabled new levels of process automation and control to be achieved, which increased process throughput, 10-fold in some cases.”
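The pooling decision Cooley describes can be sketched as a simple threshold rule: an at-line HPLC reports purity for each eluting fraction, and only fractions meeting the specification are diverted to the product pool. The function name, threshold, and trace below are illustrative assumptions, not a description of Dionex’s implementation.

```python
def pool_fractions(purity_readings, min_purity=0.95):
    """Decide, fraction by fraction, which column eluate to pool.

    `purity_readings` is a time-ordered list of purity values from an
    at-line HPLC assay; a fraction is collected only while purity meets
    the specification, automating the start and stop of product pooling.
    """
    pooled = []
    for i, purity in enumerate(purity_readings):
        if purity >= min_purity:
            pooled.append(i)  # divert this fraction to the product pool
    return pooled

# Simulated purity trace across an elution peak (illustrative values)
trace = [0.20, 0.80, 0.96, 0.99, 0.97, 0.90, 0.50]
print(pool_fractions(trace))  # → [2, 3, 4]: only these meet the 95% spec
```

Replacing a manual, after-the-fact purity review with a rule like this is what collapses cycle time: the cut points are known the moment the fractions elute.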
“The issue isn’t technological. It’s cultural,” Cooley insists. “The pharma and biotech industries have a history of ignoring process analytical advances in other industries in the mistaken belief that they can’t be applied to their industry.”
“Other industries are way ahead in terms of how to use information collected from process monitoring,” adds Chris McKenna, vp of services at Symyx.
“There’s no question that biological mechanisms are complex and difficult to control, but there are learnings they can get from industries that have optimized their processes.”
Leading petrochemical manufacturer ExxonMobil, for example, has used feed-forward controls to analyze crude oil as it enters the refinery and predict how it should be processed, Cooley points out. “There’s no technological reason to believe you can’t do that for pharmaceuticals too. People feel that pharma is different from other industries, but that’s not true.”
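The feed-forward idea Cooley cites can be reduced to a few lines: assay the incoming material and compute the downstream setpoint from a model before any deviation shows up in the output, rather than correcting an error after the fact (feedback). The linear model, names, and numbers below are illustrative assumptions only.

```python
def feed_forward_setpoint(feed_assay, slope, intercept):
    """Set a downstream process parameter directly from an assay of the
    incoming feed (feed-forward control), instead of waiting to correct
    an observed error in the output (feedback control)."""
    return slope * feed_assay + intercept

# Hypothetical: a raw-material attribute (e.g., nutrient concentration
# in g/L) sets a pump rate in mL/h via a calibrated linear model.
rate = feed_forward_setpoint(feed_assay=5.0, slope=12.0, intercept=4.0)
print(rate)  # → 64.0
```

In practice the model would come from calibration data relating feed attributes to optimal operating conditions, which is exactly the crude-assay-to-processing prediction described for the refinery case.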
“Bioprocessing is moving from a niche activity toward standardized business platforms,” McKenna elaborates, leaving the industry with the challenge of replacing one-product platforms with enterprise-wide solutions that support multiple products. “We’re working with the petrochemical and polymer industries to integrate process automation, process parameter data, and other process data,” he says.
The result is a data warehouse in which hundreds of thousands of data points are aggregated for greater utilization and accessibility. Such data access is a key element of the quality-by-design and process analytical technology approaches that, while new to the pharmaceutical and biotechnology industries, have been used successfully by the petrochemical and polymer industries. “We, as an industry, need to change how we’re doing things,” McKenna emphasizes.