Bioprocessing is a massive undertaking, and the industry is always on the lookout for ways to trim the fat and streamline manufacturing processes. This bioprocess intensification can boost efficiency, reduce costs and reagent usage, and even improve final product quality. But how intense is too intense?

“With ultra-accelerated timelines to get molecules to market of sometimes as little as 3–5 years, bioprocess development is now in the ‘insane category’,” states Stefanos Grammatikos, PhD, vice president, head of biotech sciences at UCB Pharma, at the recent BioProcess International Europe Conference in Vienna. He adds: “When we had 8–10-year timelines to get a biologic to market, there was plenty of time for bioprocessing. Now, the commercial manufacturing process may have to be locked down already during Phase I, without us having even treated a single patient with that biologic.”

According to Grammatikos, to rise to the new challenge of clinical acceleration and drive bioprocess intensification, the biopharma industry needs to look for new technologies downstream of the production bioreactor. He presented a number of approaches UCB is using to turbocharge its bioprocessing, including in silico modeling of cell culture parameters and a series of innovations in semi-continuous purification, most notably SCRAM (Single Column Recycling with Asynchronous Multiplexing).

SCRAM – a game changer?

“Process intensification is well developed in the upstream, but it is not the same in the downstream. The very high titers we’re producing in intensified cell culture are causing issues downstream, and we don’t often talk about this. Because the high titers of intensified processes are achieved through very high cell concentrations, large amounts of cell debris (lipids, host-cell proteins, and all kinds of nasty stuff) are sent through primary recovery to downstream. Conventional centrifuge technology can’t cope with this, leading to filtration issues, higher clarification filter areas, antibody reduction events, and risks to affinity chromatography and high-cost resins,” Grammatikos says.

Continuous downstream bioprocessing has become a bottleneck, and many companies, such as Pall, Novasep, and GE Healthcare Life Sciences, have advocated a semi-continuous approach using multi-column chromatography as a solution. “Intensified cell culture is way more complex than fed batch, and such multi-column systems are not always available at commercial scale for purification,” Grammatikos says. The beauty of SCRAM lies in its simplicity and in the fact that it runs on hardware readily available in every plant: a single column is overloaded, and the breakthrough from the overload is collected and reloaded onto the same column, together with fresh feedstream, during the next cycle. The column resin is therefore utilized more efficiently than in a traditional batch process. SCRAM also requires less buffer and runs faster than multi-column systems, where the need to synchronize columns generates waiting time.
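The mechanics are easy to sketch. The toy simulation below, with invented capacity and loading figures (not UCB’s), shows why recycling the breakthrough lets a single column be pushed well past the conservative loading of a conventional bind/elute cycle without losing product:

```python
# Toy model of the single-column recycling idea behind SCRAM.
# All capacities and loads are invented for illustration.

def load_column(applied, capacity=50.0, onset=0.6, step=0.1):
    """Load `applied` g of mAb per litre of resin onto a column with a
    simple breakthrough curve: full capture until the resin is
    onset*capacity full, then capture efficiency falls off linearly."""
    bound, breakthrough = 0.0, 0.0
    remaining = applied
    while remaining > 0:
        inc = min(step, remaining)
        fill = bound / capacity
        eff = 1.0 if fill < onset else max(0.0, (1 - fill) / (1 - onset))
        bound += inc * eff
        breakthrough += inc * (1 - eff)  # unbound product in flow-through
        remaining -= inc
    return bound, breakthrough

# Conventional bind/elute: stop loading at breakthrough onset so no
# product is lost, leaving ~40% of the resin capacity unused.
batch_bound, _ = load_column(applied=30.0)

# SCRAM-style overload: load past breakthrough, collect the overload,
# and re-apply it with the next cycle's fresh feed.
recycled = 0.0
for cycle in range(5):
    bound, recycled = load_column(applied=45.0 + recycled)

print(f"batch:     {batch_bound / 50:.0%} of capacity used")
print(f"recycling: {bound / 50:.0%} of capacity used, nothing lost "
      f"({recycled:.1f} g/L circulating to the next cycle)")
```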

To optimize processing parameters for SCRAM, scientists at UCB are using a scale-down model run on a Tecan liquid-handling system with RoboColumns® (600 µL volume). In a related presentation, Razwan Hanif, principal scientist in downstream process sciences at UCB, presented data from a breakthrough model study of purifying a mAb on protein A scale-down columns with a two-minute residence time, comparing a traditional bind-and-elute method with SCRAM. With SCRAM, productivity was 44 g/L/h and column capacity was 67 g/L, increases of 29% and 168%, respectively, over the traditional bind/elute method. These gains were comparable to those seen at lab scale.
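For context, those percentages imply a traditional bind/elute baseline of roughly 34 g/L/h and 25 g/L; a quick back-calculation from the reported figures (the baseline numbers themselves were not stated in the talk):

```python
# Back-calculating the implied bind/elute baseline from the reported
# SCRAM figures: 44 g/L/h (+29%) and 67 g/L (+168%).
scram_productivity = 44.0   # g/L/h
scram_capacity = 67.0       # g/L

base_productivity = scram_productivity / 1.29   # ~34 g/L/h
base_capacity = scram_capacity / 2.68           # ~25 g/L

print(f"implied baseline: {base_productivity:.0f} g/L/h, "
      f"{base_capacity:.0f} g/L")
```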

Grammatikos concludes: “We have now run SCRAM at lab, pilot, and GMP scale and found that, as well as increasing productivity and resin capacity, SCRAM also reduces buffer consumption by 40%. We are also developing another technique which builds on SCRAM, known as FRAME (Fraction Recycling with Asynchronous Multiplexing of Eluates), in order to address the trade-off of purity and yield in all bind-elute chromatography polishing steps. These techniques, which have been brought about by brilliant minds, industrial collaborations, and small doses of serendipity, could lead to turbo acceleration of intensified processing and may even prove to be disruptive in our industry.”

Mermaid-inspired detergent

While much of the conference involved discussions of process intensification, other innovations were presented too. Among them was a new detergent to replace Triton X-100, which, under a European Chemicals Agency (ECHA) REACH (Registration, Evaluation, Authorization and Restriction of Chemicals) regulation, cannot be used in the manufacture of vaccines and biologics after 2021 due to environmental concerns.

Jean-Baptiste Farcet, PhD, senior development scientist, process development at Takeda, explains: “Triton X-100 has been used for over 30 years because it is good at inactivating viruses. However, it has a negative effect on the environment because it breaks down into toxic metabolites (4-tert-octylphenol derivatives) which can bind to and disrupt some endocrine receptors.”

According to Farcet, the ideal replacement candidate must require minimal change to the manufacturing process, and therefore has to have similar physico-chemical properties, be soluble, be easy to remove, be eco-friendly, and not degrade into toxic metabolites. To find such a molecule, the team at Takeda selected detergents with a similar chemical structure but lacking a phenol ring. They analyzed the performance of a range of these detergents, including a reduced form of Triton X-100, Tergitol, Brij C10, and Polysorbate 20, but found that none met all of their criteria. So, they decided to synthesize and test a library of new detergents instead.

Farcet says: “Many of our new detergents were bad, but we found one, which we have named Nereid (after the mermaids in Greek mythology), which showed promise.” Nereid has the chemical formula C15H24O(C2H4O)n, with an extra methylene group acting as a spacer, so it does not degrade into phenolic compounds the way Triton X-100 does.

Johanna Kindermann, PhD, manager, pathogen safety at Takeda, presented work comparing Nereid to a panel of detergents, including Triton X-100 and reduced Triton X-100 (at 5% and 10% final manufacturing concentration), to show its relative efficacy in inactivating model viruses, including HIV, pseudorabies virus (PRV), xenotropic murine leukemia virus-related virus (XMuLV), and bovine viral diarrhea virus (BVDV). Virus inactivation was examined under both cold and warm conditions, and Nereid showed virus inactivation kinetics similar to those of Triton X-100 in both. It performed comparably in single-detergent treatment studies as well, in which one detergent serves as the sole inactivating agent, demonstrating its effectiveness as a powerful detergent.
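Virus inactivation kinetics of this kind are typically reported as log10 reduction values (LRV) of infectious titer over treatment time. A minimal sketch of the calculation, using hypothetical titers rather than Takeda’s data:

```python
import math

def log_reduction(titer_start, titer_now):
    """LRV = log10(initial infectious titer / titer after treatment)."""
    return math.log10(titer_start / titer_now)

# Hypothetical infectious titers (TCID50/mL) over a detergent
# treatment; illustrative numbers only, not Takeda's data.
timepoints_min = [0, 15, 30, 60]
titers = [1e7, 1e4, 5e2, 1e1]

for t, titer in zip(timepoints_min, titers):
    print(f"t = {t:>2} min  LRV = {log_reduction(titers[0], titer):.1f}")
```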

The new detergent can now be produced at kilogram scale using a three-step synthesis, and a patent has been applied for. It has also been tested in small-scale runs with recombinant proteins, plasma products, and gene therapy vectors. According to Kindermann, Nereid is scalable and compatible with existing processes, and so far it has not shown any impact on product activity. It can also be removed from the process for plasma products and gene therapy vectors, though this is still being tested for recombinant proteins.

[Figure: Virus inactivation kinetics for the newly developed single-detergent treatment, in which one detergent serves as the sole inactivating agent.]

Kindermann concludes: “In terms of performance, Nereid is a robust and suitable replacement for Triton X-100, and we are currently testing it in ecotoxicology studies against a range of receptors to ensure it is environmentally safe. So far, we believe we have a promising replacement for Triton X-100.”

In silico modeling

A number of technologies presented at BPI Europe focused on in silico modeling to incrementally improve bioprocess performance. On the downstream side, Per Lidén, digital product strategy manager at GE Healthcare Life Sciences, explains why data mining is important: “As an industry, we are generating petabytes of data from bioprocessing, but it is scattered across systems. Scientists are spending 20–30% of their time cutting and pasting data using Excel as a workhorse, and this is a problem.” Lidén presented a case study in which a biopharma manufacturer had seen an 8% decrease in process yield over four years. The company had data to analyze from 10 different systems, some of it even paper-based. GE scientists fed all of this data into a single “data lake”. Then, using visual analytics and machine learning, they were able to model 97% of the Critical Quality Attributes (CQAs) and Critical Process Parameters (CPPs). Lidén comments: “By identifying the points of variance in their downstream process and adjusting accordingly, the company was able to increase yield by 8% and reduce their Cost of Goods (CoGs).”
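The talk did not detail GE’s tooling, but the workflow it describes is straightforward to sketch: consolidate per-batch records from the scattered systems on a common key, then model a CQA against candidate process parameters to surface the points of variance. The file names, column names, and model choice below are all hypothetical:

```python
# Sketch of the "data lake" idea: merge batch records exported from
# several systems into one table, then model a CQA against process
# parameters. All file and column names here are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

sources = ["mes_batches.csv", "chromatography_runs.csv", "qc_results.csv"]
frames = [pd.read_csv(path) for path in sources]

# Join everything on a shared batch identifier.
lake = frames[0]
for frame in frames[1:]:
    lake = lake.merge(frame, on="batch_id", how="inner")

# Model one CQA (here, step yield) against candidate process
# parameters, then rank the parameters by explanatory weight.
params = ["load_density", "elution_ph", "column_age", "feed_titer"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(lake[params], lake["yield_pct"])

for name, weight in sorted(zip(params, model.feature_importances_),
                           key=lambda pair: -pair[1]):
    print(f"{name:15s} {weight:.2f}")
```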

On the upstream side, Grammatikos detailed the importance of modeling, saying: “We have seen significant acceleration of throughput with robotics and miniaturization of cell culture. But in a wet lab, where CHO cells only double every 20–24 hours, hypothesis testing can take a long time and be costly. Therefore, we need to do in silico modeling.” He presented data showing that, with UCB’s in silico model, scientists there can accurately predict mAb titers, viable cell counts, and certain product quality attributes of fed-batch, intensified, and perfusion processes. “Using our model, we only need eight runs for calibration, and then our cell culture process can be optimized for productivity and product quality with four more bioreactor runs, instead of the 60 runs we would need to do without our model,” Grammatikos says.
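The article gives no detail on the model itself, but the calibrate-then-predict pattern Grammatikos describes can be illustrated with a generic logistic-growth-plus-production model fitted to a handful of titer measurements. Everything below (model form, parameters, data) is invented for illustration and is not UCB’s model:

```python
# Generic sketch: calibrate a simple mechanistic cell culture model to
# a few measured titers, then predict a longer run. Not UCB's model;
# all numbers are invented.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

def simulate(t, mu_max, x_max, q_p):
    """Logistic viable-cell growth plus cell-specific mAb production."""
    def rhs(y, t):
        x, p = y  # viable cells (1e6 cells/mL), product (g/L)
        return [mu_max * x * (1 - x / x_max), q_p * x]
    return odeint(rhs, [0.3, 0.0], t)[:, 1]  # titer trajectory

# Titer measurements from a few calibration runs (hypothetical).
t_obs = np.array([0.0, 2, 4, 6, 8, 10, 12])                 # culture day
titer_obs = np.array([0.0, 0.3, 1.2, 2.8, 4.5, 5.9, 6.8])   # g/L

popt, _ = curve_fit(simulate, t_obs, titer_obs,
                    p0=[0.8, 20.0, 0.03], bounds=(0, [2.0, 50.0, 1.0]))
print("fitted mu_max, X_max, q_p:", popt)
print("predicted day-14 titer (g/L):",
      round(simulate(np.array([0.0, 14.0]), *popt)[-1], 1))
```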

A different angle on in silico modeling was discussed by Adam Brown, PhD, lecturer at the University of Sheffield, U.K., whose group is looking at what controls expression, rather than trying to find ways of optimizing upstream cell culture parameters. According to Brown, predicting and synthesizing the right combinations of expression cassettes can minimize wet-lab clone selection screening experiments. He warns, however, that “designing the right combination is difficult, because it is tricky to get the right amount of expression in the correct stoichiometric ratios.” In his talk, he discussed how a set of in silico algorithms in use at Sheffield can predict how different engineered genetic elements, such as promoters, 5’ UTRs, signal peptides, and 3’ UTRs, will behave when a vector is designed to express three different genes in CHO cells at low, medium, and high expression levels. The in silico predictions were then compared with wet-lab transient expression studies to determine how far the experimental results varied from the model.
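Three genes, each assignable a low-, medium-, or high-strength cassette, give the 3^3 = 27 combinations covered in the Sheffield study. The toy enumeration below, with invented element strengths, an arbitrary target stoichiometry, and a naive distance score (not Sheffield’s algorithm), shows the shape of such a design exercise:

```python
# Toy enumeration of a 3-gene x 3-level cassette design space.
# Strengths, gene names, and the scoring rule are invented.
from itertools import product

strengths = {"low": 1.0, "medium": 5.0, "high": 25.0}  # relative units
genes = ["heavy_chain", "light_chain", "helper"]

designs = list(product(strengths, repeat=len(genes)))
print(len(designs), "candidate multi-gene designs")  # 3**3 = 27

# Pick the design whose predicted expression levels sit closest to an
# arbitrary target stoichiometry (here 5:10:1 across the three genes).
target = (5.0, 10.0, 1.0)

def distance(design):
    predicted = [strengths[level] for level in design]
    return sum((p - t) ** 2 for p, t in zip(predicted, target))

best = min(designs, key=distance)
print("closest design:", dict(zip(genes, best)))
```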

“In our study, there are 27 multi-gene expression solutions, and the vast majority of our engineered CHO cells showed little variance from predicted gene expression ratios. Using our design platform, we were able to robustly and simply achieve a wide range of targeted gene stoichiometries,” Brown says. He concludes: “We believe using this approach is a robust method of user-defined manipulation of gene expression stoichiometry, which could provide a neat way of doing CHO cell engineering for mAb production and could be used, for example, to tailor the glycoprofile of biosimilars. It could be utilized instead of high throughput screens or to complement clone selection for optimizing expression, and we’re testing it for mAb engineering in collaboration with industrial partners, including a major pharma company.”

Grammatikos captured the essence of the BPI Europe conference by saying: “the future of bioprocessing will involve advanced data analytics and well-characterized intensified processes, and we’ll see a bioprocess pipeline in fully robotized facilities.” However, he cautions: “many of these innovations can be daunting and may generate fear.”
