February 1, 2011 (Vol. 31, No. 3)

Researchers Increasingly Rely on Enabling Tools and Strategies to Boost Operations

The bioprocessing industry continues to transform and reinvent itself as a range of new technologies is introduced and adopted. The sequencing of the human genome has made available a wealth of targets for drug development, translating into a demand for appropriate biotherapeutics. Moreover, the expiration of patent protection for many biologics is fueling increasing interest in biosimilars.

Improvements at the upstream end, including vast increases in production titers of cell-cultured proteins, have made feasible products that previously would have been ignored because of the initial difficulty and astronomical cost of preparing adequate amounts for pilot studies. Finally, the demand for higher performance in purification processes has motivated biotech companies to drive their technologies to the edge.

Better characterization, cleaning up contaminating material, and overcoming molecular heterogeneity were among the issues that researchers were talking about at “BioProduction” held recently in Barcelona.

Ion mobility mass spectrometry (IM-MS), a recently developed technology for molecular characterization, is a major interest of Rune Salbo, a Ph.D. student at Novo Nordisk. “We had substantial experience with mass spectrometry, but neither I nor anyone else in Denmark had worked with ion mobility mass spec, so I invested a lot of time and effort optimizing the parameters of this instrument.”

The fundamental basis of ion mobility is straightforward: it measures how fast a given ion moves under an electric field through an atmosphere composed of neutral drift-gas molecules. The ions under investigation are introduced into the drift chamber and subjected to a homogeneous electric field while they collide with the neutral drift molecules contained within the system.
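For a classic drift tube, the relationship between the measured mobility and the ion's collision cross section is given by the Mason-Schamp equation. The sketch below is illustrative only, not Novo Nordisk's code; the insulin parameter values in the usage note are rough assumptions.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19   # elementary charge, C
K_BOLTZ = 1.380649e-23       # Boltzmann constant, J/K
DALTON = 1.66053906660e-27   # kg per dalton

def mason_schamp_ccs(mobility, charge_z, ion_mass_da, gas_mass_da,
                     temperature_k=298.0, gas_density=2.687e25):
    """Collision cross section (m^2) from a measured drift-tube
    mobility (m^2 V^-1 s^-1) via the Mason-Schamp equation.
    gas_density defaults to the Loschmidt number (molecules/m^3
    at standard conditions)."""
    # Reduced mass of the ion/drift-gas pair
    mu = (ion_mass_da * gas_mass_da) / (ion_mass_da + gas_mass_da) * DALTON
    prefactor = (3 * charge_z * E_CHARGE) / (16 * gas_density)
    thermal = math.sqrt(2 * math.pi / (mu * K_BOLTZ * temperature_k))
    return prefactor * thermal / mobility
```

For example, an insulin monomer (~5,808 Da) at charge 5+ drifting through nitrogen (28 Da) with a mobility near 1 cm²/V/s yields a cross section on the order of 1,000 Å², in the expected range for a small protein.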

Novo Nordisk uses the traveling wave IM-MS developed by Waters in which a sequence of symmetric potential waves continually propagating through a tube propels the ions, and different species transit the tube in unequal times. These modifications allow more accurate determination of fundamental properties of the molecule under investigation, including its overall shape, subunit packing, and topology.

The Waters system is reportedly the first commercially available instrument able to study intact proteins and protein complexes. Prior to this, the only available instruments were custom-built, a challenge that many investigators chose not to pursue. “My goal in these studies was to employ the new Waters instrument to measure collision cross sections, but the calibration of the instrument failed,” Salbo said. “In a collaboration with Matthew F. Bush and Carol V. Robinson of the department of chemistry at Oxford University, we have now solved this problem.”

Salbo reported on studies using insulin as a model system, with the goal of gaining accurate measurements of the collisional cross-sectional dimensions of the molecule. To obtain an accurate picture, he stressed the use of native-like calibrant molecules that span the molecular weight and CCS values for the insulin molecule. Salbo invested considerable effort in measuring at a range of wave velocities in order to develop consistent results.
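The article does not give Salbo's exact calibration procedure. A widely used approach for traveling-wave instruments, however, is a power-law fit of calibrant drift times against their charge- and reduced-mass-corrected literature CCS values (CCS′ = A·t^B). A minimal sketch of that fit, using simple log-log least squares:

```python
import math

def fit_twims_calibration(drift_times, ccs_prime):
    """Fit the common traveling-wave calibration CCS' = A * t**B by
    linear least squares in log-log space. ccs_prime holds the
    corrected literature CCS values of native-like calibrants."""
    xs = [math.log(t) for t in drift_times]
    ys = [math.log(c) for c in ccs_prime]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    log_a = mean_y - b * mean_x
    return math.exp(log_a), b  # (A, B)

def calibrated_ccs(a, b, drift_time):
    """Apply the fitted power law to an unknown's drift time."""
    return a * drift_time ** b
```

Because the exponent B depends on instrument settings, the fit must be repeated at each wave velocity, which is consistent with Salbo's effort to measure across a range of wave velocities.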

Salbo strongly endorses the use of IM-MS for producing a detailed picture of complex molecules. “X-ray crystallography is looked upon as the gold standard for a detailed description of a molecule’s structural properties, but x-ray crystallography and most other methods in structural biology give an average picture of all species in the solution. This means that homogeneity and high purity are essential.

“With ion mobility, you look at each species individually and the presence of other species does not interfere with the measurement. For instance, in a mixture of insulin monomers, dimers, and hexamers, we observe each individual state and not an average of the three combined.”

Researchers at Novo Nordisk use Waters’ Synapt G2 ion mobility mass spectrometer to produce a detailed picture of complex molecules.

Delineating Protein Particles

Contending with various forms of junk that arise during protein purification requires a number of different strategies. Alla Polozova, a senior scientist in analytical biochemistry at MedImmune, described her approach to detecting and eliminating the variety of particulate contaminants in protein solutions that harry investigators.

“We need to identify and enumerate the contaminating particles,” insisted Polozova as she related her group’s experiences with two contentious monoclonal antibody projects. Common types of contaminating materials include polyethylene, cellulose, and various proteinaceous particles, and these are studied using various types of microscopy, including atomic force, electron, light scanning, and chemical.

Particles are counted using light obscuration, which enumerates particles one at a time, or optically with a conventional hemocytometer. Although each method has its advantages and disadvantages, combining them ensures that an accurate census of the offenders, ranging in size from 2 to 300 µm, will be obtained. Sub-micron particles are detected with the NanoSight system, in which particles in suspension scatter light from a laser beam and are tracked by video microscopy.
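Combining methods this way amounts to stitching size-binned counts together, trusting each instrument only over the size range where it performs well. The sketch below illustrates the idea; the bin boundaries and the 1 µm crossover point are illustrative assumptions, not values from MedImmune's workflow.

```python
def merge_particle_census(nta_bins, lo_bins, crossover_um=1.0):
    """Merge sub-micron NTA (e.g., NanoSight) counts with
    light-obscuration counts into one size census. Each bin is a
    tuple (lower_um, upper_um, particles_per_ml). NTA is kept below
    the crossover size, light obscuration above it."""
    census = [b for b in nta_bins if b[1] <= crossover_um]
    census += [b for b in lo_bins if b[0] >= crossover_um]
    return sorted(census)  # ordered by lower bin edge
```

A variant of the same logic could prefer the hemocytometer count in bins where light obscuration loses contrast at high protein concentration, which is the failure mode Polozova describes.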

The most troublesome particles are protein aggregates, formed by irreversible recruitment of monomers and smaller units. Their genesis can be triggered by a variety of unfortunate events, including dilution, shaking, low pH, and the presence of silicone oil and other materials. Because of their propensity to block capillaries and act as immunogens, they pose substantial risk to patients.

Some visual techniques may be inaccurate because of the state of the solution, requiring multiple approaches. Light obscuration was found to fail at higher protein concentrations, because the particles become “invisible” due to loss in contrast. This can be ameliorated by calibration with protein-like particles, as was done in the characterization of a monoclonal antibody in which particle counts were found to be erratic and inconsistent.

An investigation carried out to determine the root cause of particle formation showed that the solutions are dynamic: over time new particles form and older ones merge into larger particles. These particles may be composed of protein, or of proteins fused with contaminating oils or polypropylene.

“The characterization of particles in protein solutions offers unique challenges,” said Polozova. “Interference of background protein, the fragile nature and transparency of particles, and sample handling may result in formation of new particles. This requires a variety of orthogonal methods for reliable characterization.

“For reliable counting, avoid dilutions, but keep in mind that there are multiple particle-formation mechanisms and multiple triggering factors,” Polozova concluded.

Critical Quality Attributes

What do critical quality attributes (CQA) mean to bioprocessing? Anne Kowal, Ph.D., associate director at Millennium Pharmaceuticals, considered this question in her presentation. According to Kowal, “A CQA is a physical, chemical, microbiological, or biological property or characteristic that should be within an appropriate limit, range, or distribution to ensure the desired product quality.”

The significance for effective product development is the generation of an integrated and well-defined control strategy, leading to an optimal end result. As Dr. Kowal explained, it is very difficult to determine which attributes are truly critical. A general assay may show that a parameter is different, but the investigator may never know what the impact on product quality will be.

“Many inputs complicate the analysis of biotherapeutics. Glycosylation impacts the molecule’s performance, and other secondary modifications, such as deamidation in the complementarity-determining region of antibodies, can severely degrade efficacy.”

Other quality attributes may be extrinsic, such as microbial contaminants and the choice of cell culture components. All these factors introduce heterogeneity into biotherapeutic molecules that makes teasing out differences between “normal” and more critical variations a challenging task. While the final goal is to link these attributes to clinical outcomes, this may not be possible in the real world since it would be impractical to test every variation in the molecule in human trials.

Dr. Kowal related two case studies on a monoclonal antibody that the company is developing, describing typical challenges faced in the process. Because of time constraints and deadlines, only existing platform methods were available for assessing the material. Moreover, only very small samples of the product were available for analysis, particularly to support planned process changes, and there was no quantitative charge profile assay for comparability or stability.

Dr. Kowal judged that the deamidated antibody fit the requirements for a known potential CQA. The typically high pI of the antibody made assessment of existing IEF stability data difficult, and the sialylated product was also a known potential CQA. Therefore, the development of a quantitative charge profile assay was given high priority. The Kowal group developed a user-friendly cation-exchange method, which demonstrated that there was a significant amount of general process variability.

“We identified a number of antibody variants present in both batches,” Dr. Kowal explained, “including dimers, aggregates, fragments, and various minor changes, in short, nothing particularly unexpected.”

Using two robust potency assays, a binding assay and a cell-based bioassay, Dr. Kowal and her colleagues were able to correlate the critical quality attributes of dimer formation and truncation of the molecules with reduced activity of the antibody. “We have found that a strong bioassay is the key to a good evaluation of product variants. Multiple assays may help to determine nuances associated with efficacy when in vivo evaluation isn’t possible. Understanding safety questions can be complex. In the final analysis one may have to rely on general knowledge and whatever can be gleaned from clinical experience (or available nonclinical models).”

Problematic Proteins

The ability to effect protein stabilization is improving rapidly, yet work in this area still presents numerous challenges. Weak protein interactions may impact purification negatively, especially when concentrations are raised in the course of the procedure. The result can be aggregation and conformational instability, problems that may prove irresolvable for native proteins. An obvious solution lies in the use of computer modeling to identify stabilizing mutations that can be introduced into the molecule. Such redesigned molecules could lend themselves more effectively to a robust and efficient purification scheme.

Peter Tessier, Ph.D., assistant professor in the department of chemical and biological engineering at Rensselaer Polytechnic Institute, and his colleagues are investigating nanoparticle-based assays to measure weak protein interactions with the aim of generating proteins more amenable to purification protocols.

According to Dr. Tessier, at present no method exists for characterizing these interactions by high-throughput screening of hundreds of samples. His strategy exploits the separation-dependent optical properties of gold nanoparticles bearing immobilized proteins to detect weak self-interactions.

The Tessier group has further refined the procedures using biotin-avidin interactions to generate protein-nanoparticle conjugates that report protein self-interactions through changes in their optical properties. His approach will allow a broad range of technologically significant phenomena, including protein crystallization and aggregation, to be quantitatively evaluated. The longer-term goal of this work is to use these insights to engineer antibodies that have both high affinity and high resistance to aggregation.
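The optical readout of such assays rests on the fact that when protein-coated gold nanoparticles are drawn closer together by attractive self-interactions, their plasmon absorbance peak shifts to longer wavelengths. The crude classifier below illustrates that logic only; the 2 nm threshold and the function itself are assumptions for illustration, not Tessier's published method.

```python
def self_interaction_readout(peak_wavelengths_nm):
    """Classify a protein candidate from the plasmon peak wavelengths
    of its nanoparticle conjugates measured at increasing protein
    concentration. A progressive red shift suggests attractive
    self-association (aggregation-prone); a flat trace suggests a
    better-behaved candidate. Threshold is an illustrative assumption."""
    shift = peak_wavelengths_nm[-1] - peak_wavelengths_nm[0]
    return "attractive" if shift > 2.0 else "neutral-or-repulsive"
```

Screening a panel of antibody candidates with such a readout would let aggregation-prone molecules be discarded early, which is the longer-term goal Tessier describes.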

“The technology is mature,” Dr. Tessier stated. “Finding high-performing antibodies is no longer a problem, so there is a broader desire to consider the biophysical properties of the formulation process. You can either re-engineer proteins that you have already isolated, or you can be smarter about your choice of those proteins that you select early on.”

Tessier noted that companies such as Amgen are doing comprehensive studies early on with their candidate pool. For instance, they can screen promising antibodies by evaluating their behavior on a protein A column, retaining those that bind effectively without undergoing irreversible changes. “Today there is no question about re-engineering. With that accomplished, the future lies in purification and related issues.”

The improvements in rapid screening, computer modeling, and protein purification protocols are bringing important advances to the industry. Combined with upstream advances, they are moving biologics down a path toward greater convenience and significant cost savings.

Researchers at Rensselaer Polytechnic Institute are using nanoparticle-based assays to measure weak protein interactions in an attempt to generate proteins more amenable to purification protocols.

K. John Morrow Jr., Ph.D. ([email protected]), is president of Newport Biotech and a contributing editor for GEN.
