Aug 1, 2010 (Vol. 30, No. 14)

Strategies for Successful Process Scale-Up

Advanced Tools Provide Better Solutions for Critical Aspect of Biologics Development

Managing Complex Datasets

Matt Osborne, Ph.D., cell culture lead at Eli Lilly, said the Kinsale, Ireland, plant is now expanding into biologics with the completion of a €300 million (about $369 million) investment in a facility using cell culture technology for the manufacture of antibody-based protein therapeutics.

He described work on managing complex datasets to derive an integrated analytical and process control strategy, outlining how quality by design (QbD) principles are applied in practice and how the complex datasets these methodologies produce are managed.

An important aspect of this is risk assessment of CQAs (critical quality attributes: those qualities of a product that can affect the patient). Tools that have been found useful for this at Lilly include the cause-and-effect matrix, Ishikawa fishbone diagrams, and FMEA (failure mode and effects analysis), all of which can map out what will have an impact on CQAs.
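As a minimal sketch of the FMEA side of such an assessment, failure modes can be ranked by a risk priority number (severity x occurrence x detectability); the failure modes and scores below are hypothetical, not Lilly's actual assessment.

    # FMEA-style risk ranking sketch; the failure modes and 1-10
    # scores below are hypothetical, for illustration only.
    failure_modes = [
        # (failure mode, severity, occurrence, detectability)
        ("elution buffer pH drift", 7, 4, 3),
        ("bioreactor temperature excursion", 8, 2, 2),
        ("feed timing error", 5, 5, 6),
    ]

    # Risk priority number (RPN) = severity x occurrence x detectability.
    ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes),
                    reverse=True)
    for rpn, name in ranked:
        print(f"RPN {rpn:3d}  {name}")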

“There is a huge amount of data from such studies and it is a challenge to integrate it all into something usable,” Dr. Osborne observed. To this end, the company has built a data warehouse and developed a strategy of building a meta-model with all the data, allowing for factors such as type of bioreactor, scale of manufacture, and the site where manufacturing takes place. These models are subsequently used to inform design space verification experiments.
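A minimal sketch of what fitting such a meta-model could look like, assuming a simple linear model and invented file and column names rather than Lilly's actual warehouse schema:

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical extract from a process data warehouse; the file,
    # column names, and model form are illustrative assumptions.
    df = pd.read_csv("warehouse_extract.csv")

    # Meta-model: response (e.g., titer) as a function of a process
    # input plus categorical factors for bioreactor type, scale,
    # and manufacturing site.
    model = smf.ols("titer ~ C(bioreactor_type) + C(scale) + C(site) + ph",
                    data=df).fit()
    print(model.summary())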

One example of this approach involved a protein A unit operation, where a process model was built using DoE (design of experiments) data. This showed clearly which process parameters are related, and the acceptable ranges (design space) for those parameters. For instance, protein A elution buffer concentration and elution buffer pH have a relationship in design space. Ultimately, everything that could have an impact on CQAs is included and mapped onto a “decision tree”, which reveals how the parameters are classified and how they can be used to inform the process control strategy. Dr. Osborne concluded that QbD leads to large and often disparate datasets, and judicious use of statistics is required to integrate and analyze them. “This is one of the biggest challenges we face.”
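As a toy illustration of how two interacting parameters define a design space, a fitted response model can be evaluated over a grid of elution pH and buffer concentration and the region meeting a CQA limit retained; the model and its coefficients below are invented, not Lilly's.

    import numpy as np

    # Invented quadratic model for an impurity CQA as a function of
    # protein A elution pH and elution buffer concentration (mM);
    # coefficients are illustrative only.
    def impurity(ph, conc):
        return (2.0 + 0.8 * (ph - 3.5) ** 2
                + 0.002 * (conc - 50) ** 2
                - 0.01 * (ph - 3.5) * (conc - 50))

    ph, conc = np.meshgrid(np.linspace(3.0, 4.0, 21),
                           np.linspace(20, 80, 31))

    # Design space = parameter combinations meeting the CQA limit.
    in_space = impurity(ph, conc) <= 2.5
    print(f"{in_space.mean():.0%} of the grid lies inside the design space")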

New technologies that can predict large-scale behavior from small-scale experiments are of great interest. Joey Studts, Ph.D., associate director of purification development at Boehringer Ingelheim, said that it is important to think commercially as early as possible in process development. Miniaturization and automation both play an especially important role in Boehringer’s strategy for developing specific, robust processes for the manufacture of monoclonal antibodies.

Dr. Studts described how the company’s RAPPTor®, a fully automated purification screening platform, contributes to process development. Developed with the help of Tecan, RAPPTor (rapid protein purification technology) allows 96 variables to be measured on 500 samples a day, enabling early identification of conditions for full-scale downstream processing. The platform is applied as early as possible. “The first chance we get with any material, we put it into RAPPTor,” Dr. Studts said. “It makes a real impact on downstream-processing decisions.”

Understanding the scalability issue is key to early process learning using automation. “We’ve tried to optimize protocols so they best reflect scale-up.” Various DoE strategies can be used in this context, namely two-level DoE, multilevel DoE, and full factorial DoE, with this last approach giving the most information.

In one Boehringer case study, the researchers wanted to find the optimal conditions for a cation-exchange polishing step for a therapeutic antibody. They looked at three resin types, a pH range of 5–6, and both binding and elution conductivity. The readouts were product yield, host cell protein content, and monomer yield. A clear picture of the impact of each variable was obtained using RAPPTor.
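A full factorial screen of this kind can be enumerated directly, as in the sketch below; the factor levels are illustrative assumptions, since the actual levels used in the study were not given.

    from itertools import product

    # Illustrative factor levels for the cation-exchange polishing
    # screen; the study's actual levels were not disclosed.
    resins = ["resin_A", "resin_B", "resin_C"]
    ph_levels = [5.0, 5.25, 5.5, 5.75, 6.0]
    binding_cond = [5, 10, 15]    # binding conductivity, mS/cm
    elution_cond = [20, 30, 40]   # elution conductivity, mS/cm

    # Full factorial: every combination of every level is run, which
    # is why it yields the most information (and the most runs).
    runs = list(product(resins, ph_levels, binding_cond, elution_cond))
    print(f"{len(runs)} runs")  # 3 * 5 * 3 * 3 = 135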

The sweet spot aimed for was a monomer yield of over 99.2% and a product yield of 85–100%. The conditions RAPPTor predicted for achieving this sweet spot included a pH of 5.75, and these gave good results on a real column; the DoE and column results were therefore comparable. “This gave us a lot of process learning from the very beginning,” Dr. Studts said.
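Finding such a sweet spot amounts to filtering the screening results against both acceptance criteria; the records below are invented for illustration.

    # Invented screening results as (pH, monomer %, product yield %).
    results = [
        (5.25, 98.9, 92.0),
        (5.50, 99.1, 90.5),
        (5.75, 99.4, 88.0),
        (6.00, 99.3, 82.0),
    ]

    # Sweet spot per the stated criteria: monomer yield over 99.2%
    # and product yield in the 85-100% range.
    sweet = [r for r in results if r[1] > 99.2 and 85 <= r[2] <= 100]
    print(sweet)  # -> [(5.75, 99.4, 88.0)]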

The second case study using RAPPTor also involved a therapeutic antibody. Here, conditions for scale-up were chosen according to the most critical output parameter, and the process could readily be optimized for a 13-fold scale-up.
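One common way to execute such a scale-up in chromatography, assumed here for illustration rather than taken from the study, is to hold bed height and linear velocity constant so residence time is preserved, growing only the column cross-section:

    import math

    def scaled_diameter(diameter_cm, factor):
        """Column diameter after a volumetric scale-up of `factor`,
        holding bed height and linear velocity constant (a common
        convention; not necessarily the study's method)."""
        # Volume scales with cross-sectional area, so diameter
        # scales with the square root of the factor.
        return diameter_cm * math.sqrt(factor)

    print(f"{scaled_diameter(1.0, 13):.2f} cm")  # 1 cm column -> ~3.61 cm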

Using a QbD approach throughout development and scale-up made it possible to learn early on what the design space looked like for each process step, noted Dr. Studts. “Efforts should be focused on design space and process understanding in scale-up,” he said. In summary, the aim is to eliminate waiting time by carrying out process optimization and process learning much earlier and at a small scale using RAPPTor, and it has now been shown that data from this screening is indeed scalable.

