GEN Exclusives

Mar 12, 2014
Implementing a Full Quality-by-Design Strategy

Luxury or necessity for developing the best biologicals?

Scientist using Umetrics' MVDA analysis software.

  • Data is King

    “QbD is really all about knowledge management, but in reality, gaining access to prior knowledge about a molecule to define the TPP and CQAs is very difficult,” says Andreas Schneider, vp/international business leader, custom biotech at Roche Diagnostics. “For example, clinical data is not always accessible, as it is sometimes proprietary to the country or company where the original development or clinical work was performed. If you cannot find out whether a factor is important to the CQAs, this will affect how you set up your DoE for defining the manufacturing process.”

    Schneider concludes: “Sometimes with a quantitative risk profile of a product you’ll have a moderate or significant ranking, but what you really need are quantitative values to define the cut-off ranges. There is a need to scientifically argue the quantitative ranges and to provide that data with the QbD filing documents, but this requires seamless access to a massive amount of data. At Roche, to ensure we can provide the knowledge management for QbD, we have implemented a project called DAMAS (Data Acquisition Management Analysis System). For DAMAS, we have integrated 200 devices that run and store data generated from 70,000 samples per year from different business units, so that we can really get an overview of the data from our analysis systems.”

    According to Schneider, this integrated, centralized data-management concept is required because drugs will pass through the FDA filing process significantly faster if they are linked to an FDA fast-track program. The evidence: in 2013, the Roche drug Gazyva was launched as the first biologic under the FDA’s “breakthrough therapy” designation, and it was filed with a full QbD approach including the design space. The timelines to pass through the clinical trial phases were significantly shorter than those of standard filing processes, so Roche and Genentech are going to continue filing any future drugs that have breakthrough therapy status supported by data management and QbD principles. This type of resource-intensive approach to developing knowledge management for QbD may be available in big pharma, but for smaller biotechs, what is possible in a QbD context?

  • QbD Lite

    Scientists running a bioreactor with online PAT testing. [Sartorius]

    Most small biotechs don’t have time to do a full QbD submission, but they do use QbD tools and automate their manufacturing processes where practical, as QbD is really all about reducing variability. So in a successful biotech, what does QbD lite look like? In general, the TPP and CQAs for protein-based therapeutics are defined around upstream bioprocess parameters. Staff perform a DoE using temperature, pH, and feed in shake flasks, 2 L bioreactors, or automated microbioreactors to model their 500 L scale production batches. For bioreactor control they use automated pH and DO probes for offline sampling, and for online Process Analytical Technology (PAT) they use mass spectrometers to measure gas production and biomass probes to determine cell mass. They also have alarmed remote controls on their bioreactors and use software to monitor the bioreactors remotely. Automation removes the temptation for bioreactor operators “to go rogue” and sample when it is convenient rather than at set time points. And because staff can monitor bioreactor runs from home, they can bring bioprocess runs back on track without losing the batch if they need to.
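    A DoE of the kind described above often starts from a simple full factorial screen over the chosen parameters. As an illustration only (the factor names and levels below are hypothetical, not drawn from any specific process), such a design can be enumerated in a few lines of Python:

    ```python
    from itertools import product

    def full_factorial(factors):
        """Return every combination of factor levels as a list of run dicts.

        `factors` maps a factor name to a list of its levels. Dict insertion
        order (Python 3.7+) fixes the run ordering.
        """
        names = list(factors)
        return [dict(zip(names, levels))
                for levels in product(*(factors[n] for n in names))]

    # Hypothetical two-level screening design for a small-scale bioreactor DoE.
    design = full_factorial({
        "temperature_C": [35.0, 37.0],
        "pH": [6.9, 7.1],
        "feed_rate_mL_h": [5, 10],
    })

    print(len(design))  # 2 x 2 x 2 = 8 runs
    ```

    Each dict in `design` is one bioreactor run; in practice, center points and replicates would be added, and the runs randomized, before fitting a model of the responses.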

  • Where Next for QbD?

    Many speakers at the QbD forum agreed that the major driver of implementing a full QbD strategy is to gain a deeper process understanding and in doing so produce more robust and safer biologics. This means that the use of single-use facilities where automation and online measurement is much more prevalent is going to be the trend going forward. Becker states: “A major aspect is product safety, which is primarily affected by the robustness and flexibility of the processes developed. This is precisely where the combination of innovative, robust process analytics, model-predictive control, and flexible automation is going to play a central role.”

    Amit Banerjee, Ph.D., research fellow, global biologics at Pfizer, says, “For process understanding of the cell culture we have to have good scaledown models, which show equivalence at 2 L, pilot scale, and 12,000 L.” Dr. Rathore adds: “I predict all companies developing biologicals will be using scaledown models and MVDA in the next decade.”
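    The MVDA Dr. Rathore refers to typically involves projection methods such as principal component analysis (PCA) applied to batch process data. The sketch below, using synthetic data in place of real batch records, shows the core of such an analysis: mean-centering the observations and projecting them onto the leading principal components, where unusual batches show up as outlying scores.

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project mean-centered data onto its leading principal components.

        X: rows are observations (e.g. time points in a batch), columns are
        process variables (e.g. temperature, pH, DO). Returns the score matrix.
        """
        Xc = X - X.mean(axis=0)
        # SVD yields the principal directions without forming a covariance matrix;
        # singular values come back sorted, so PC1 carries the most variance.
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Synthetic stand-in for two correlated process variables.
    rng = np.random.default_rng(0)
    temp = rng.normal(37.0, 0.2, size=50)                       # temperature, deg C
    ph = 7.0 + 0.5 * (temp - 37.0) + rng.normal(0, 0.02, 50)    # pH tracks temperature
    X = np.column_stack([temp, ph])

    scores = pca_scores(X)
    print(scores.shape)  # (50, 2)
    ```

    In a monitoring context, control limits on the scores (and on the residuals left after projection) flag batches that deviate from the model built on historical good batches.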

    According to Schneider, the biopharma industry is lagging far behind other industries and needs the right data management in place. “QbD has a very strong IT component, which means that data quality matters and the IT stakeholders have to be involved,” he says. “Everyone fears the IT guys, but they shouldn’t; if they are involved at the beginning it may add delay, but it will speed up the process in the end if they are fully engaged in the project.”

    “Full QbD starts from a comprehensive control strategy of your raw materials and cell banks,” Banerjee added. “This is time consuming to do and implement, and perhaps leads to the proverbial pot of gold in the end for those companies that can do it well.”


