How do you choose the best, most efficient HPLC method when outcomes are affected by every variation to the methodology? The answer, according to researchers at the recent “HPLC 2006” conference in San Francisco, is to develop a systematic approach to method development based upon a keen understanding of available data and, when practical, to automate it to quickly identify the best methods from hundreds of thousands of possibilities.
Narrow the Options
Phase-optimized liquid chromatography (POPLC™), developed by Bischoff Chromatography (www.bischoffchrom.com), offers “a completely new approach” to optimizing liquid chromatographic separations, according to Stefan Lamotte, Ph.D., vision manager for columns and stationary phases.
In this method, “the mobile phase remains constant,” he says, and the stationary phase is optimized. “This is completely opposite to the way optimization had traditionally been done, where the stationary phase remains constant and the optimization of the mobile phase is the focus of interest.”
The method is based on the PRISMA model (developed by S. Nyiredy of the Research Institute for Medicinal Plants, Hungary) for optimizing mobile-phase composition in thin-layer chromatography. With the help of the model, users can create any combination of phases by combining multiple, varying parts of phases A, B, and C. In his example, Dr. Lamotte combines three parts C18, three parts phenyl, and one part polar-embedded C18 to achieve a separation that cannot be done with any one of the stationary phases on its own; varying the segments yields different results. In total, Dr. Lamotte says, there are about 800 different stationary phases on the market today, so “it is difficult for users to find the right column for their separations.”
Bischoff’s POPLC software streamlines the process: researchers collect the retention times for all analytes on five stationary phases that are orthogonal in selectivity, then enter those retention times, the column length, and the void time into the software. The software calculates every combination possible with those five phases and identifies the one giving the best resolution, along with its column length and void time. “The number of combinations can be very high,” Dr. Lamotte says: with five stationary phases, a maximum column length of 250 mm, and segments as short as 10 mm, there are 142,500 possible combinations.
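The size of that search space is easy to reproduce with a short counting script. This is a hedged sketch: it assumes (as the additive-retention premise implies) that only the total length of each phase matters, not segment order, and that columns are built from 10 mm steps up to 250 mm overall. The count it produces is of the same order as the figure Dr. Lamotte quotes.

```python
from math import comb

def count_segment_combinations(n_phases=5, max_mm=250, seg_mm=10):
    """Count distinct coupled columns, assuming only the total length of
    each stationary phase matters and the column is assembled from
    seg_mm-long segments, from one segment up to max_mm overall."""
    max_segments = max_mm // seg_mm
    # Stars and bars: ways to split k segments among n_phases phases.
    return sum(comb(k + n_phases - 1, n_phases - 1)
               for k in range(1, max_segments + 1))

print(count_segment_combinations())  # 142505, close to the ~142,500 quoted
```

The small discrepancy with the quoted figure likely reflects kit-specific constraints on which segment lengths are physically available.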
When POPLC was used to separate 33 compounds in a municipal wastewater sample by LC/MS/MS, the separation took 20 minutes with a 5% deviation between the prediction and the actual measurement, according to Dr. Lamotte. “In comparison, the former gradient method required 45 minutes.”
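The prediction step rests on an additive retention model: the retention factor of a coupled column is the length-weighted average of the factors measured on each pure phase. A minimal sketch of that idea, using hypothetical retention factors (not Dr. Lamotte's data) and the smallest gap between adjacent peaks as a crude stand-in for resolution:

```python
from itertools import product

def predict_k(fractions, k_per_phase):
    """Predict each analyte's retention factor on a coupled column.
    fractions: length fraction of each phase; k_per_phase: one list of
    per-analyte retention factors per stationary phase."""
    n_analytes = len(k_per_phase[0])
    return [sum(f * ks[a] for f, ks in zip(fractions, k_per_phase))
            for a in range(n_analytes)]

def best_combination(k_per_phase, segment_mm=10, total_mm=50):
    """Brute-force the segment split that maximizes the smallest gap
    between adjacent predicted retention factors."""
    n_phases, n_seg = len(k_per_phase), total_mm // segment_mm
    best, best_sep = None, -1.0
    for counts in product(range(n_seg + 1), repeat=n_phases):
        if sum(counts) != n_seg:
            continue
        ks = sorted(predict_k([c / n_seg for c in counts], k_per_phase))
        sep = min(b - a for a, b in zip(ks, ks[1:]))
        if sep > best_sep:
            best_sep, best = sep, counts
    return best, best_sep

# Hypothetical retention factors for three analytes on two phases:
k_c18 = [1.0, 1.2, 3.0]
k_phenyl = [1.0, 2.5, 2.6]
combo, sep = best_combination([k_c18, k_phenyl])
```

With these illustrative numbers, neither pure phase resolves all three analytes well, but a 20 mm / 30 mm split does; the actual software additionally searches all five phases and scales predictions by column length and void time.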
In most cases, gradient elution is not needed, which lowers separation costs. The constant background under isocratic conditions enables detection technologies that cannot be used with gradients (such as RI or conductivity), improves ionization in MS detection, and allows the use of UV wavelengths below 200 nm. Additionally, the column length is adaptable, and much of the mobile phase can be recycled.
Automated Online SPE
Conventional manual SPE-LC is labor- and cost-intensive, time-consuming, and error-prone. Dionex (www.dionex.com) has designed an automated process that integrates extractive sample clean-up into an HPLC system. The benefits, according to Frank Arnold, director of HPLC marketing, include high responsiveness and cost-effectiveness, increased speed and automation, and reduced human contact with hazardous samples.
As Arnold explains, a small solid-phase extraction (SPE) column is connected to a conventional HPLC column through a two-position, six-port switching valve. Online SPE fractionates the injected sample into the sample matrix (which is directed to waste) and the analytes (which are retained on the SPE column). The analytes are then transferred from the SPE column to an analytical column for separation and detection. While the analytes are being separated, the SPE column is washed and reconditioned, which requires careful coordination.
That coordination “requires exact information on sample matrix elution time, analyte breakthrough time, and analyte transfer time,” Arnold says, to determine the correct valve switching times. This normally challenging process has been streamlined with the use of a dual-gradient pump, a switching valve integrated into the column thermostat, and a software wizard.
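The three quantities Arnold names map directly onto a valve-timing program. The sketch below is purely illustrative (the function name and the simple midpoint heuristic are my assumptions, not Dionex's actual algorithm), but it shows how the switch times follow from the measurements:

```python
def valve_program(matrix_elution_s, analyte_breakthrough_s, transfer_s):
    """Return the two switch times (in seconds) for a two-position,
    six-port valve: load -> inject once the matrix has gone to waste but
    before the first analyte breaks through the SPE column, then back to
    load after the analytes have transferred, freeing the SPE column for
    washing and reconditioning."""
    if matrix_elution_s >= analyte_breakthrough_s:
        raise ValueError("matrix must elute before analytes break through")
    # Midpoint heuristic: split the safety margin between the two events.
    to_inject = (matrix_elution_s + analyte_breakthrough_s) / 2
    back_to_load = to_inject + transfer_s
    return {"load_to_inject_s": to_inject, "inject_to_load_s": back_to_load}

program = valve_program(matrix_elution_s=60, analyte_breakthrough_s=120,
                        transfer_s=30)
```

If the matrix has not fully eluted before the analytes start breaking through, no valid switching window exists, which is exactly why the wizard's parameter determination matters.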
Dionex’ software wizard walks users through determining the ideal parameters for the system, along with their significance. It helps set the correct valve switching times, determine the loading-pump options for the SPE column, and establish the gradient and run conditions for the second pump, which controls the separation of analytes on the analytical column. The software is as straightforward as that used to size pictures or to set up e-mail on a computer.
Automated Sample Prep
LabSmith (www.labsmith.com) recently licensed an automated system, developed by Sandia National Laboratories, for preparing 1 to 500 microliters of biological sample. One component of the system is a new cartridge format that can be refilled with any type of packing to perform size-exclusion chromatography or SPE on microliter volumes.
Called MAPS™ (Modulated Automated Processing System), it is used for complex protocols for continuous-flow sample preparation, and, according to Gabriela Chirica, Ph.D., research scientist at Sandia, “is very simple and robust. Under optimal operating conditions, it removed the interfering reducing agent from a 10-microliter sample in 70 seconds, using a cartridge packed with Biogel P6.”
The system also uses membranes with various mesh sizes that allow particulates of up to about 20 microns to pass through, enabling direct, online concentration and processing of intact organisms at low pressures in a portable system, she says.
Dr. Chirica says that the system is “highly reproducible and 10 times less expensive than commercial alternatives,” noting that she is “still using two-year-old cartridges filled with various materials.”
Particle Size and Performance
As methodologies change, parameters that previously were unimportant are beginning to affect outcomes. The influence of particle size distribution on column performance is one example, notes Uwe Neue, Ph.D., director, external research, Waters (www.waters.com).
The van Deemter equation, he says, is commonly used to describe column efficiency as a function of linear velocity. Two of the equation’s three components, packed-bed uniformity and mass transfer, are related to particle size. It is the interplay of efficiency and flow resistance, Dr. Neue says, that leads to the best selection of packing materials for high-quality separations.
The resistance to flow, which can be assessed with the Kozeny-Carman equation, also contains particle size as a variable, he adds.
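Both relations can be written down in a few lines. The forms below are the standard textbook expressions, not Dr. Neue's fitted coefficients; the A-term scales with particle size, the C-term with particle size squared, and the Kozeny-Carman pressure drop with the inverse square of particle size:

```python
import math

def plate_height(u, A, B, C):
    """Van Deemter: H = A + B/u + C*u, where u is the linear velocity,
    A is eddy dispersion, B longitudinal diffusion, C mass transfer."""
    return A + B / u + C * u

def optimal_velocity(B, C):
    """Velocity minimizing H; the minimum is H_min = A + 2*sqrt(B*C)."""
    return math.sqrt(B / C)

def kozeny_carman_dp(u, L, eta, d_p, eps=0.4):
    """Kozeny-Carman pressure drop across a packed bed of length L,
    viscosity eta, particle diameter d_p, interstitial fraction eps."""
    return 180 * eta * u * L * (1 - eps) ** 2 / (d_p ** 2 * eps ** 3)
```

The trade-off Dr. Neue describes follows from the two equations together: halving the particle diameter improves the mass-transfer (C) term fourfold but also quadruples the backpressure, so efficiency and flow resistance must be balanced when selecting a packing material.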
In choosing parameters to characterize average particle size, he compared two studies of how blending particles of different sizes affects column performance. From an older study performed with radially compressed columns and larger particles, he concluded that backpressure (flow resistance) is linearly related to the number-averaged particle size, while efficiency, “the C-term of the van Deemter equation,” depended on the volume-averaged particle size. “Radially compressed columns have uniform densities, independent of particle size,” he says.
A newer study using steel columns and smaller packing materials (2.5, 3.5, and 5 microns) showed that backpressure depended on packing density as well as particle size: under fixed packing conditions, the interstitial fraction decreased with decreasing particle size, and backpressure increased as the interstitial fraction decreased.
In his experiments, departures from the van Deemter equation under isothermal conditions indicated that thermal effects were influencing the efficiency data. He therefore developed new equations and concluded that “blending particles has no benefit, only disadvantages.” As Dr. Neue explains, backpressure is determined by the number-averaged particle size, while mass transfer is governed by the volume-averaged particle size; the broader the particle size distribution, the further these two averages diverge. Columns based on blended particles therefore showed inferior mass transfer.
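A toy calculation (my illustration, not Dr. Neue's data) makes the divergence concrete. A blend of small and large particles has a number average near the small particles, which sets the backpressure, but a volume average near the large particles, which sets the mass transfer, so the blend pays the pressure penalty of small particles without their efficiency:

```python
def number_average(diams, counts):
    """Number-averaged diameter: sum(n*d) / sum(n)."""
    return sum(d * n for d, n in zip(diams, counts)) / sum(counts)

def volume_average(diams, counts):
    """Volume-weighted mean diameter: sum(n*d^4) / sum(n*d^3)."""
    num = sum(n * d ** 4 for d, n in zip(diams, counts))
    den = sum(n * d ** 3 for d, n in zip(diams, counts))
    return num / den

# 50/50 blend (by particle count) of 2.5- and 5-micron particles
diams, counts = [2.5, 5.0], [1.0, 1.0]
d_num = number_average(diams, counts)   # 3.75 microns
d_vol = volume_average(diams, counts)   # about 4.72 microns
```

A narrow, single-size packing keeps the two averages essentially equal, which is the quantitative content of “blending particles has no benefit.”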
How Pure is Pure?
In terms of sample preparation, Sachem (www.sacheminc.com) uses displacement chromatography to concentrate impurities and thus enhance their detection. “This is primarily a preparative tool, but it is useful in chemical analysis, too,” notes Barry Haymore, Ph.D.
“After loading a sample, our new, high-affinity displacers for ion-exchange act as molecular pistons, pushing the product and associated impurities down the column, giving a well-defined product band and narrow, well-defined transition zones,” Dr. Haymore continues. “The widths of the transition zones are determined by relative binding strengths of the proteins to the matrix and by the quality of the column packing.
“The concentrated impurities reside in the narrow transition zones, while the amount of product is greatly reduced, thereby allowing close-running impurity peaks to be observed more easily. The transition zones are sampled or collected as fractions for subsequent analysis by traditional analytical chromatography, which can be performed on the same ion-exchange column and/or a reversed-phase column.”
A great benefit of this method, he says, is that the column, in displacement mode, operates at 50–70% of saturation capacity, concentrating the impurity peaks “just before or just after the main peak.” Results have indicated sensitivity increases of 10- to 400-fold for detecting small impurity peaks. The method, Dr. Haymore says, typically reveals low-level impurities not seen in traditional HPLC analyses.
In his example of bovine lactoglobulin, standard analysis revealed four impurities totaling 3.02%; Sachem’s two-step procedure revealed 23 impurities totaling 8.25%. In another example, purified porcine somatotropin, traditional analysis revealed four impurity peaks for a 1.9% impurity rate, while Sachem’s method revealed 14 peaks for a total impurity measurement of 5.1%.
Prefractionation and Protein Analysis
Ciphergen (www.ciphergen.com) has developed a new way of directly analyzing proteome components in blood serum without the disadvantages of the classic depletion approach, which removes high-abundance proteins and their associated species and also dilutes the sample. “The traditional challenge,” according to Luc Guerrier, Ph.D., head of separation applications, “is that the dynamic range is greater than 10 orders of magnitude.”
In contrast, Ciphergen’s method of detecting and identifying low-abundance proteins in serum relies on its Protein Equalize™ technology and MultiSelect™ fractionation. In the first, the protein concentration range is compressed using a solid-phase combinatorial ligand library that dilutes high-abundance proteins while concentrating low-abundance species.
“Each bead has a unique ligand structure, and the whole library, representing millions of different structures mixed in the same micro-tube, can bind most if not all of the proteins from complex biological extracts within minutes,” Dr. Guerrier explains.
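The equalizing effect can be caricatured in a few lines. This is a toy model of my own, not Ciphergen's algorithm: because each species can occupy at most a fixed capacity of its cognate ligand, abundant proteins are clipped at saturation while rare ones bind completely, compressing the dynamic range:

```python
def equalize(concentrations, capacity):
    """Amount of each species retained by a saturable bead library:
    abundant species are clipped at the ligand capacity, rare species
    bind in full."""
    return [min(c, capacity) for c in concentrations]

serum = [1e6, 1e3, 1.0]          # arbitrary units, six orders of magnitude
bound = equalize(serum, 10.0)    # what the washed beads retain
compression = (max(serum) / min(serum)) / (max(bound) / min(bound))
```

In this caricature a million-fold concentration range collapses to ten-fold on the beads, which is the sense in which the library “dilutes” the abundant species relative to the rare ones.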
The second technology, MultiSelect, is used to isolate a target protein for identification. It uses a predefined sequence of superimposed sorbents to micropurify the protein, Dr. Guerrier explains. “First, a collection of sorbents is screened, and one capable of adsorbing the target protein is selected, regardless of co-adsorbed impurities,” he says. Then a limited number of complementary resins are identified that capture the impurities but not the target protein.
These resins are assembled in a cascade configuration, with the resin that captures the target protein placed last. The target protein is then isolated using a single elution buffer, and all monitoring is performed by SELDI MS with ProteinChip arrays.
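The cascade logic can be sketched as a chain of capture predicates. The predicates and species names below are illustrative assumptions, not Ciphergen's resins; the point is the ordering: impurity-capturing resins sit upstream, so only the target reaches (and binds) the final resin:

```python
def run_cascade(sample, resins):
    """resins: list of predicates, each True for species that resin binds,
    in flow order. Returns the set of species each resin holds after the
    sample has flowed through the cascade."""
    captured = [set() for _ in resins]
    for species in sample:
        for held, binds in zip(captured, resins):
            if binds(species):
                held.add(species)
                break  # retained here; never reaches later resins
    return captured

resins = [lambda s: s == "impurity_A",          # complementary resin 1
          lambda s: s.startswith("impurity"),   # complementary resin 2
          lambda s: True]                       # target-capturing resin, last
captured = run_cascade(["impurity_A", "impurity_B", "target"], resins)
```

Because everything upstream has already been stripped, a single elution buffer applied to the last resin recovers the micropurified target, matching the single-elution step described above.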
The method seems very promising, Dr. Guerrier says, though “the primary media screening needs to be scaled down, and the process needs to be validated on novel biomarkers.”