Companies that make biotherapeutics are always looking for opportunities to improve efficiency. Consequently, these companies tend to focus on bottlenecks, problem areas that constrain the overall speed and effectiveness of bioprocessing. The key word here is “overall.” It suggests a comprehensive view, not a disconnected collection of close-up views.
If bottlenecks aren’t isolated problems, but process-wide problems, why shouldn’t the resolution of bottlenecks—debottlenecking—reflect a process-wide approach? This is the question that emerged during the preparation of this article, which presents the insights of several bioprocessing experts. According to these experts, debottlenecking can, and should, be pursued with a Bioprocessing 4.0 mindset.
Before getting to the bottlenecks, let’s define Bioprocessing 4.0, which turns out to be its own challenge. “Bioprocessing 4.0 is a term without a solid definition,” suggests Wolfgang Sommeregger, PhD, a product manager at Bilfinger, an industrial services provider based in Germany. “According to the common understanding, it is closely related to the digital transformation of the biopharmaceutical industry.” Most experts would probably agree with this description.
In a bid to clarify the Bioprocessing 4.0 concept, Sommeregger offers an opinion: “It is mainly about connecting, structuring, contextualizing, and making use of data to improve production processes.” With this statement in mind, let’s hear what Sommeregger and other experts have to say about Bioprocessing 4.0’s relevance to debottlenecking challenges.
Bioprocessing’s variety of tools and techniques, together with the diversity of therapeutics being made, creates a complex information environment. “The potentially useful data sources in a typical bioproduction process can be varied—equipment, sensors, off-line measurements, materials, or personnel, just to name a few,” Sommeregger says. “Consequently, data collection, storage, and aggregation can be challenging, especially in highly regulated industries, such as the pharma industry.”
To make use of data from so many possible sources, bioprocessors need to consider the overall workflow for a particular therapeutic. “For enhanced data collection, the connection of all processing equipment, analyzers, and other devices—the main data sources—to one database or piece of management software becomes necessary,” Sommeregger insists.
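The unification Sommeregger describes can be illustrated with a minimal sketch. The source names, parameter names, and schema below are hypothetical, chosen only to show the idea of normalizing heterogeneous readings into a single store:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

# Hypothetical unified schema: every source (equipment, sensor,
# off-line assay) is normalized to the same record before storage.
@dataclass
class ProcessRecord:
    source: str      # e.g., "bioreactor-01" (illustrative name)
    parameter: str   # e.g., "pH" or "titer_g_per_L"
    value: float
    timestamp: str   # ISO 8601, UTC

def normalize(source: str, parameter: str, value: float,
              when: Optional[datetime] = None) -> ProcessRecord:
    """Contextualize a raw reading into the shared schema."""
    when = when or datetime.now(timezone.utc)
    return ProcessRecord(source, parameter, float(value), when.isoformat())

# One central store, regardless of where each reading originated.
database: List[ProcessRecord] = []
database.append(normalize("bioreactor-01", "pH", 7.02))
database.append(normalize("hplc-offline", "titer_g_per_L", 3.4))
```

In practice, the central store would be a process historian or time-series database rather than a list; the point is that a shared schema is what makes later aggregation and analysis possible.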
Even the age of the equipment plays a role. “In case of older equipment, it may require considerable effort to make the essential data digitally available,” Sommeregger notes. “However, certain novel solutions can help to translate more outdated communication protocols to new standards.”
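As a hedged illustration of such translation, consider a legacy device that emits delimiter-separated ASCII frames. The frame format and field names below are invented for illustration; a small adapter maps terse legacy keys to modern, descriptive field names:

```python
# Invented legacy frame format: "ID=42;TMP=37.1;DO=55.2".
# Mapping from legacy keys to modern, self-describing field names.
LEGACY_TO_STANDARD = {
    "TMP": "temperature_celsius",
    "DO": "dissolved_oxygen_pct",
}

def translate_frame(frame: str) -> dict:
    """Translate a legacy delimiter-separated frame to modern field names.

    Unmapped fields (such as the device ID here) are simply dropped;
    a real adapter would route them to appropriate metadata fields.
    """
    out = {}
    for field in frame.split(";"):
        key, _, raw = field.partition("=")
        name = LEGACY_TO_STANDARD.get(key)
        if name:
            out[name] = float(raw)
    return out

print(translate_frame("ID=42;TMP=37.1;DO=55.2"))
# {'temperature_celsius': 37.1, 'dissolved_oxygen_pct': 55.2}
```

Real-world solutions wrap the same idea in standard industrial protocols, but the core task is this kind of key-and-unit translation.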
A company must collect as much data as possible and put it in a usable format. As Sommeregger explains, “To fully profit from technologies that have been proven beneficial in other industries—for example, self-optimization, machine learning, and artificial intelligence—the first and probably most demanding step is the automated integration of the proper data.”
Like scientists in biotechnology and healthcare, scientists in bioprocessing find that they are relying on increasingly large amounts of information. “The next big challenge in bioprocessing is to improve and control the process through which big data is collected,” declares Hanna P. Lesch, PhD, gene therapy unit director, Kuopio Center for Gene and Cell Therapy, Finland. She adds that this challenge poses several key questions: How can we collect big data and store it? How can we use artificial intelligence and machine learning to provide useful analysis for process optimization, prediction, and risk mitigation? How can we further improve the efficacy of the manufacturing and quality of the product?
Questions such as these could be addressed with process analytical technology, the use of which is, Lesch notes, “an increasing trend.” To analyze a process, scientists need to collect the desired data during bioprocessing. “Raman spectroscopy and mass spectrometry,” Lesch points out, “provide new options for real-time in-process monitoring.”
That information must be collected and put in a usable form. “So, data integration solutions are needed to centralize the data and connect to controlling systems,” Lesch explains. “Data must be in a digital format—for example, digital notebooks—and it is better to minimize the manual sampling and replace it with online monitoring.”
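A toy sketch of what “replacing manual sampling with online monitoring” can mean in practice: readings arrive continuously and are checked against a control band as they arrive, rather than waiting for a manual sample. The readings, control limits, and window size here are illustrative, and a fixed list stands in for a continuous Raman or mass-spectrometry stream:

```python
from statistics import mean

def monitor(stream, low, high, window=3):
    """Yield (reading, in_control) pairs, judging each reading by a
    moving average over the last `window` values to damp sensor noise."""
    recent = []
    for value in stream:
        recent.append(value)
        recent = recent[-window:]
        yield value, low <= mean(recent) <= high

# Illustrative pH stream drifting out of a 6.9-7.1 control band.
readings = [7.00, 7.02, 7.05, 7.40, 7.45]
flags = [ok for _, ok in monitor(readings, low=6.9, high=7.1)]
# flags -> [True, True, True, False, False]
```

A production system would feed the out-of-control signal back to the controlling system Lesch mentions; the sketch shows only the monitoring half of that loop.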
Dealing with downstream
Almost any aspect of bioprocessing can create a bottleneck, but some aspects may be more obvious in certain operations. For example, in operations that make biotherapeutics based on monoclonal antibodies, the biggest challenge is usually downstream productivity, notes Joe Makowiecki, enterprise solutions director of business development, Cytiva.
“Upstream titer increases and process-intensification techniques are leading to increased product quantity,” he says. “The downstream process needs to be able to effectively and efficiently accommodate this increased product mass.”
Some advanced forms of biotherapeutics—such as the cancer-fighting chimeric antigen receptor (CAR) T cells—face multiple bottlenecks, observes Maria Papathanasiou, PhD, a lecturer in chemical engineering at Imperial College London. In a recent article that appeared in Cancer Gene Therapy, Papathanasiou and colleagues pointed out that the current supply-chain model for CAR T cells can treat only hundreds of patients per year in a given country.
“As we move toward more personalized medicines and particularly autologous therapies, the current supply-chain models will have to adapt to meet both patient and provider expectations,” the article’s authors wrote. “Therefore, the pharmaceutical industry needs to rethink the one-type-fits-all model that is currently in place.”
For most of the advanced therapies, Makowiecki says, “Upstream product titers and downstream product recoveries are low.” Both upstream and downstream inefficiencies could be addressed with available technologies, he explains.
For example, Makowiecki suggests that upstream productivity could be improved by coupling stable, high-titer suspension cell lines with improved cell culture media and feeds, improved or novel harvest clarification, and cell retention devices. He also has general suggestions for downstream processing: “Improvements will come from the utilization of next-generation affinity chromatography resins, single-batch high-throughput chromatography devices, as well as single-pass concentration and buffer-exchange devices.”
In biotherapeutic-making “factories,” the underlying technologies can be improved in various ways. Let’s consider the technologies for creating biotherapeutic-making cell lines. “In the process of generating clonal cell lines that produce high levels of a protein therapeutic, scientists typically transfect linear DNA encoding the gene of interest adjacent to a selective marker [such as a drug-resistance marker],” says Claes Gustafsson, PhD, chief commercial officer and co-founder, ATUM. “The DNA fragment is randomly integrated into the chromosome of the host, often in the form of unstable concatemers or as truncations of the fragment leaving only the selectable marker in place.”
Just as in real estate, location matters in a medicine-making cell line. “The genetic location of the insert will drastically affect the expression yield of the clone,” Gustafsson emphasizes. “Accordingly, in a pool of transfectants, only a very small number of clones are expressing the protein of choice at a high level.” So, to find a useful clone, scientists might look at lots of possibilities. “In a typical transfection, one may screen 100 plates worth of clones, which is about 10,000 clones,” Gustafsson notes.
To improve the batting average of making a desired cell line, scientists can take another approach. “An alternative way is to use transposon-mediated transfection. The gene of interest is synthesized into a transposon, and the corresponding engineered transposase—enzyme—actively integrates the gene of interest into the genome with a cut-and-paste mechanism,” Gustafsson explains. “This process results in 10–50 copies of the insert across the genome in almost every cell in the transfected pool.”
Gustafsson asserts that if ATUM’s technology is used, a large portion of the clones will express the gene of interest at a high yield. “None of the clones will be concatemers or truncations,” he continues, “resulting in absolute genetic stability even after greater than 90 generation doublings.” As a result, it can take as few as one or two plates, fewer than 200 clones, to find a clone that expresses the desired therapeutic at the 2–5-g/L level required for commercial production.
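The screening arithmetic can be made explicit. The per-clone hit rates below are back-calculated assumptions chosen to be consistent with the quoted figures (roughly 100 clones per plate), not measured values:

```python
# ~10,000 clones across ~100 plates, per the quoted figures.
CLONES_PER_PLATE = 100

def plates_needed(clones_per_hit: int) -> float:
    """Expected number of plates screened before the first
    high-expressing clone, assuming one usable clone per
    `clones_per_hit` clones (mean of a geometric model)."""
    return clones_per_hit / CLONES_PER_PLATE

# Random integration: assume ~1 usable clone in 10,000.
print(plates_needed(10_000))  # 100.0 plates
# Transposon-mediated: assume ~1 in 100, consistent with
# finding a hit within "one or two plates."
print(plates_needed(100))     # 1.0 plate
```

The roughly hundredfold difference in assumed hit rate is what turns a 100-plate screen into a one- or two-plate screen.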
The high level of similarity between the transfected pool and the subsequent clones also derisks months of clonal development work, Gustafsson notes. Plus, the yield and quality attributes in the pool will be closely mimicked in the clone.
From start to finish, manufacturers of biotherapeutics must find and resolve bottlenecks to realize the anticipated benefits of Bioprocessing 4.0 and beyond. Some of the problems arise in molecular biology, and others involve automation and analytics. Improvements at any inefficient step, though, will improve how a bioprocess works and how patients will benefit.