January 1, 2008 (Vol. 28, No. 1)

Elizabeth Lipp

Sometimes Answering the Big Question Begets Bigger Questions

When one considers protein profiling, the only immediately obvious conclusion is that it’s a complex topic—so much so that CHI’s 7th Annual “Pep Talk” in San Diego is devoting six tracks and four days to the discussion later this month. Exploring challenges with protein expression, peptide and protein-based therapeutics, and mining the plasma proteome are priorities for attendees.

Preceding the week-long conference is a one-day CHI-sponsored meeting of the Clinical Plasma Proteomics Consortium (www.clinicalplasmaprofiling.net). “It’s a fairly small conference, but the attendees are enthusiastic and there is a lot of great discussion,” says Joanna Hunter, Ph.D., director of protein analysis at Caprion Proteomics (www.caprion.com), who is chairing the Consortium.

Peptide MRM-based Assays

One talk is geared toward examining mass spec strategies for biomarker verification using peptide multiple reaction monitoring (MRM)-based assays for targeted quantitation of proteins in plasma. “We’ve been focusing on developing tools to enable these MS-based verification experiments, specifically homing in on targeted MRM profiling,” says Christie Hunter, Ph.D., senior staff scientist at Applied Biosystems (www.appliedbiosystems.com).

“We’ve been focused on how to best generate highly reproducible assays on large numbers of peptides/proteins that can be used on the larger patient sample sets required for biomarker verification.”

The technical challenges to the preliminary validation of putative protein biomarkers include detection of low-abundance proteins in complex tissue or biological fluids in addition to high-throughput, high-precision quantification.

The technique employed by Dr. C. Hunter takes a targeted approach to protein profiling. “There has been a definite gap in the biomarker research pipeline; many proteins found by MS discovery techniques are never taken to the next level of validation because of the difficulty in generating specific assays. A highly multiplexed, targeted MS assay that is rapid to develop will hopefully go a long way toward addressing this problem.”

Dr. C. Hunter’s group was tasked with rapidly developing MRM assays and leveraged the MIDAS workflow on the 4000 QTRAP system. “We worked out a number of strategies to develop high-quality MRM methods to obtain the best sensitivity and dynamic range for peptides from a panel of putative protein biomarkers for cardiovascular disease,” notes Dr. C. Hunter. “For previously detected proteins, MRMs were designed based on peptide MS/MS spectra. For the remaining candidates, MRMs were developed either by in silico design, based on gene or protein sequences, or by targeted, direct detection using the MIDAS workflow.”

A key point Dr. C. Hunter’s group wanted to address, she reports, was the ability to compute MRM transitions in silico for the proteins a researcher wants to verify and then to confirm those transitions experimentally. “The ability to look at a protein and have the instrument tell us what the best peptide is to develop MRMs for is proving to be a critical component,” adds Dr. C. Hunter. “We can almost always find a better peptide or MRM transition to use for an assay from a MIDAS workflow experiment, and this translates to more robust assays with often dramatic sensitivity increases.

“The other advantage of a targeted workflow is that you see only the data you are looking for, and you always see the data you are looking for,” she adds. “It’s highly accurate and highly reproducible.”
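To make the in silico route concrete, the sketch below shows, in simplified form, how MRM transitions might be proposed from a protein sequence alone: digest the sequence in silico, keep peptides with favorable properties, and pair a precursor m/z with a few y-ion fragment m/z values. It is an illustration only and does not reproduce the MIDAS workflow or any Applied Biosystems software; the filtering rules and the toy sequence are assumptions.

```python
# Minimal sketch of in silico MRM transition design from a protein sequence.
# Simplified illustration only; real tools score many additional factors.

# Monoisotopic residue masses (Da)
RESIDUE_MASS = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276, "V": 99.06841,
    "T": 101.04768, "C": 103.00919, "L": 113.08406, "I": 113.08406,
    "N": 114.04293, "D": 115.02694, "Q": 128.05858, "K": 128.09496,
    "E": 129.04259, "M": 131.04049, "H": 137.05891, "F": 147.06841,
    "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER, PROTON = 18.01056, 1.00728

def tryptic_digest(protein, missed_cleavages=0):
    """Cleave after K/R (not before P); return candidate peptides."""
    sites = [0] + [i + 1 for i in range(len(protein) - 1)
                   if protein[i] in "KR" and protein[i + 1] != "P"] + [len(protein)]
    peptides = []
    for i in range(len(sites) - 1):
        for j in range(i + 1, min(i + 2 + missed_cleavages, len(sites))):
            peptides.append(protein[sites[i]:sites[j]])
    return peptides

def peptide_mz(peptide, charge=2):
    mass = sum(RESIDUE_MASS[aa] for aa in peptide) + WATER
    return (mass + charge * PROTON) / charge

def y_ion_mz(peptide, length, charge=1):
    frag = peptide[-length:]
    mass = sum(RESIDUE_MASS[aa] for aa in frag) + WATER
    return (mass + charge * PROTON) / charge

def design_mrms(protein, transitions_per_peptide=3):
    """Pick well-behaved tryptic peptides and propose Q1/Q3 pairs (y-ions)."""
    assays = []
    for pep in tryptic_digest(protein):
        if not (7 <= len(pep) <= 20):      # favor peptides that ionize and fragment well
            continue
        if "C" in pep or "M" in pep:       # avoid easily modified residues
            continue
        q1 = peptide_mz(pep, charge=2)
        q3s = [y_ion_mz(pep, n) for n in range(len(pep) - 1, 2, -1)]
        assays.append((pep, round(q1, 3),
                       [round(x, 3) for x in q3s[:transitions_per_peptide]]))
    return assays

if __name__ == "__main__":
    # Toy sequence for illustration only
    for pep, q1, q3_list in design_mrms("MKWVTFISLLLLFSSAYSRGVFRRDTHK"):
        print(pep, q1, q3_list)
```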

Biomarker Verification Is Critical

Leigh Anderson, Ph.D., founder and CEO of the Plasma Proteome Institute (www.plasmaproteome.org), explains that the bottleneck happens when there is no feasible way to analyze 1,500–2,000 samples. “If you were to ask how many people have ever analyzed that many samples, the answer would be nobody,” says Dr. Anderson. “You can’t do it with the current discovery platforms; you need alternative technologies. It’s a difficult problem, and there is a lot of enthusiasm in the community to expand the technology portfolio to solve the problem.”

The shortfall in new protein clinical diagnostics emerging from proteomics research appears to reflect a lack of critical biomarker verification capacity rather than a failure in biomarker discovery, in which large numbers of candidates are routinely uncovered, notes Dr. Anderson. “The primary problem is that over 95 percent of these candidates fail as reliable disease-specific indicators in the general population, where biological noise often overpowers the weak biomarker signal.”

Dr. Anderson says that one way to approach the bottleneck is to place primary emphasis on development of technology and collection of sample sets specifically tailored to support statistically definitive tests of large panels of biomarker candidates. “There are two ways to solve the bottleneck. One is to miniaturize immunoassays, which is what the array people do, and it’s a very interesting technical approach, though limited by cost and assay quality issues,” notes Dr. Anderson.

“The other way is quantitative mass spectrometry. Of course sensitivity is an issue with the MS approach, so what we’ve done is create a hybrid with immunoassay sensitivity and mass spectrometry specificity, a method called SISCAPA.”

This approach incorporates many MS tools (e.g., MRM) that have been around for at least five years for measuring drugs and metabolites, according to Dr. Anderson. The novelty, however, is in combining MS with immuno-enrichment and applying the combination to peptides and proteins. “It’s a great way to get an accurate measurement of a series of specific targets,” Dr. Anderson explains.
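As a rough illustration of the quantitative step in such a hybrid assay, the sketch below infers the endogenous (“light”) peptide concentration from the MRM peak-area ratio against a spiked, stable-isotope-labeled (“heavy”) standard. The function name and all numbers are hypothetical and are not taken from Dr. Anderson’s SISCAPA implementation.

```python
# Hedged sketch of the readout in a SISCAPA-style assay: a known amount of a
# stable-isotope-labeled ("heavy") peptide standard is spiked into the digest,
# both forms are enriched with the same anti-peptide antibody, and the
# endogenous ("light") peptide concentration is inferred from the MRM
# peak-area ratio. All values below are illustrative only.

def light_concentration(light_area, heavy_area, heavy_spike_fmol, plasma_volume_ul):
    """Return endogenous peptide concentration (fmol/uL) from the light/heavy ratio."""
    if heavy_area <= 0:
        raise ValueError("heavy standard not detected; assay failed QC")
    ratio = light_area / heavy_area
    return ratio * heavy_spike_fmol / plasma_volume_ul

# Example: 50 fmol of heavy standard spiked into a digest from 10 uL of plasma
conc = light_concentration(light_area=1.2e5, heavy_area=3.0e5,
                           heavy_spike_fmol=50.0, plasma_volume_ul=10.0)
print(f"endogenous peptide: {conc:.1f} fmol/uL of plasma")
```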

The key thing, Dr. Anderson says, is making the case for streamlining the validation process. “It has been challenging to get people on board because discovery has been perceived as more fun and higher leverage, but more people are getting smarter about it, especially once they see that biomarker candidates don’t advance themselves to the clinic.”

While some bemoan that technology is not keeping pace with discovery, there’s lively discussion suggesting the opposite is true. Dr. J. Hunter, in addition to chairing the proceedings, will be presenting recent proof-of-concept studies Caprion has done. “One of the biggest challenges I see is not the technology itself but the uses of that technology,” says Dr. J. Hunter. “The technology is capable of answering the questions we ask, but it’s a matter of overcoming the perception that it’s not.”

Using Existing Technologies Wisely

One of the examples Dr. J. Hunter will be using is a test case her company performed for a client. “We did a 48-sample study, where the source of 24 samples was known (breast cancer, ovarian cancer, or healthy controls),” reports Dr. J. Hunter. “Based on that data, we trained our systems to identify the source of the 24 blinded samples. We’ve also completed a similar internal study with prostate and lung cancer plasma samples where a training set was used to correctly identify a validation sample set. I’ll be presenting several studies that demonstrate how the technology can be used, rather than focusing on the technology itself.”
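The train-on-known, predict-on-blinded design Dr. J. Hunter describes can be sketched generically as below: a classifier is fit to a matrix of protein intensities from the labeled samples and then applied to the blinded ones. The simulated data and the choice of classifier are assumptions for illustration; the article does not specify how Caprion builds its models.

```python
# Generic sketch of training on samples of known origin and classifying a
# blinded set. Data, features, and classifier are hypothetical.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Simulated training set: 24 samples x 200 protein intensities, 3 known classes
X_train = rng.normal(size=(24, 200))
y_train = np.repeat(["breast", "ovarian", "healthy"], 8)
X_train[y_train == "breast", :5] += 2.0      # inject a toy class signal
X_train[y_train == "ovarian", 5:10] += 2.0

# Simulated blinded set: 24 samples whose labels are withheld
X_blinded = rng.normal(size=(24, 200))

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)          # learn from the samples of known origin
predictions = model.predict(X_blinded)
print(predictions)                   # predicted source for each blinded sample
```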

The technology is sensitive enough to distinguish different cancers because the platform is built on a strong QA environment, notes Dr. J. Hunter. “Great custom software, attention to detail, and SOPs at every step make the difference.”

Recounting Recent Advances

Using well-established techniques in novel ways is a hot topic. Chris Becker, executive director of PPD (www.ppdi.com), says that PPD’s position—working with both pharma and academia—has given it the opportunity to publish work in applying proteomics to diagnostics and drug development. “What we’ve done is created a viable commercial offering that’s reproducible, SOP-driven, and has QA and QC procedures in place,” notes Becker. “No one wants to hear excuses. People want information. We only get one shot, so we want to get it right the first time.”

One of the challenges Becker faced was the fact that there is bias in the field. “It was really a sort of heresy, a bias in the field that you couldn’t do quantitative mass spectrometry on proteins unless you used isotopic labels,” reports Becker. “But the industrialization of the process, the ability to automate the system so that all samples can be processed the same way, has made label-free analysis possible.”

Becker says that quality control is key. “Every tenth sample has to be evaluated for reproducibility and consistency, and you have to have good software—all these ingredients are key. And the software, which we created here, is important. Because the data is so complex, you have to make sure you have a robust platform that will support its processing. It’s not a trivial matter.”
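A minimal sketch of the kind of rule Becker describes might track the coefficient of variation of each protein’s intensity across the periodic QC injections and flag any protein that drifts. The 20% threshold and the data layout are illustrative assumptions, not PPD’s SOP.

```python
# Hedged sketch of periodic QC: every tenth injection is a pooled QC sample,
# and reproducibility is tracked as the coefficient of variation (CV) of each
# protein's intensity across those QC runs. Threshold and layout are assumed.
import statistics

def qc_failures(qc_runs, cv_threshold=0.20):
    """qc_runs: list of dicts mapping protein -> measured intensity in one QC run."""
    failures = {}
    for protein in qc_runs[0]:
        values = [run[protein] for run in qc_runs]
        mean = statistics.mean(values)
        cv = statistics.stdev(values) / mean if mean else float("inf")
        if cv > cv_threshold:
            failures[protein] = round(cv, 3)
    return failures

# Example: three QC injections, two proteins
qc_runs = [
    {"ALBU_HUMAN": 1.00e6, "TRFE_HUMAN": 2.1e5},
    {"ALBU_HUMAN": 1.05e6, "TRFE_HUMAN": 1.2e5},
    {"ALBU_HUMAN": 0.98e6, "TRFE_HUMAN": 2.9e5},
]
print(qc_failures(qc_runs))   # proteins whose CV exceeds the threshold
```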

One study Becker will discuss involves a sample of 30 subjects, half with lymphoma of the central nervous system. “Some controls had multiple sclerosis, and some had other neurological problems. We looked at the cerebrospinal fluid differential concentration for 500 or so proteins,” says Becker. “It’s a good story because it was careful and reproduced. It’s an example of what can be done in the field, if it’s done carefully.”
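The group comparison implied by such a study can be sketched as a per-protein test between cases and controls followed by multiple-testing correction, as below. The simulated intensities and the 5% false-discovery cutoff are assumptions, not the published analysis; only the 15-versus-15 design and the roughly 500 proteins come from the description above.

```python
# Sketch of a per-protein comparison of CSF intensities between cases and
# controls, with Benjamini-Hochberg correction. Simulated data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_proteins = 500
cases = rng.normal(0.0, 1.0, size=(15, n_proteins))      # 15 CNS-lymphoma subjects
controls = rng.normal(0.0, 1.0, size=(15, n_proteins))    # 15 neurological controls
cases[:, :10] += 1.5                                       # inject 10 true differences

pvals = np.array([stats.ttest_ind(cases[:, i], controls[:, i]).pvalue
                  for i in range(n_proteins)])

def benjamini_hochberg(pvals, fdr=0.05):
    """Return a boolean mask of proteins significant at the given FDR."""
    order = np.argsort(pvals)
    ranked = pvals[order]
    thresholds = fdr * np.arange(1, len(pvals) + 1) / len(pvals)
    passed = ranked <= thresholds
    cutoff = ranked[passed].max() if passed.any() else -1.0
    return pvals <= cutoff

significant = benjamini_hochberg(pvals)
print(f"{significant.sum()} proteins differ at 5% FDR")
```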

Nanospray Technologies

Tony Tegeler, senior research scientist at Monarch LifeSciences (www.monarchlifesciences.com), will be discussing the development of nanospray technologies for biomarker discovery as well as the LifeMarker™ assay—a mass spectrometry-based targeted assay for biomarker validation and application.

“Basically, for biomarker discovery we use a bottom-up approach—samples containing proteins are submitted by our customers (typically large pharmas or biotech), and Monarch performs a patented reduction and alkylation procedure followed by enzymatic digestion. The peptides are then analyzed by mass spectrometry,” says Tegeler.

Monarch has acquired a unique approach for biomarker discovery—label-free relative quantification. “We use algorithms that provide customers with pertinent and reliable protein information (up- and down-regulated proteins from relative quantification) from small sample volumes,” notes Tegeler.

“We can also follow-up on interesting biomarker discoveries by using our targeted LifeMarker assays. These mass spectrometry-based assays can provide absolute quantification for selected proteins and are significantly more rapid to develop than immunoassays. They are a great tool to enable and accelerate biomarker validation and applications.”

Tegeler’s talk will focus on novel uses of liquid chromatography for protein profiling. “When chromatography was in its infancy, you needed large columns and a lot of sample, measured in milliliters,” explains Tegeler. “Now, with nanospray technologies, we are able to work with samples that measure only in microliters—very small, but packed with information.”

The LifeMarker assay allows a researcher to quickly determine the absolute quantity of a specific protein or panel of proteins. Moreover, these assays, unlike most ELISAs, can discriminate among closely related forms of specific proteins, such as differentially modified proteins, making them a useful tool for most biomarker applications. “It’s a powerful technique that gives good quantitative results,” says Tegeler. “With this targeted approach, you get better data, which gives pharmaceutical companies the confidence to pursue leads into the clinic.”
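A targeted assay’s absolute readout generally rests on a calibration curve: peak-area ratios are measured for a dilution series of known concentrations, a line is fitted, and unknowns are read off that line. The sketch below illustrates that general idea with invented numbers; it is a generic sketch, not the LifeMarker protocol, whose details are not disclosed here.

```python
# Hedged sketch of calibration-curve-based absolute quantification for a
# targeted MS assay. All numbers are invented for illustration.
import numpy as np

# Calibration points: known analyte concentration (ng/mL) vs. measured
# analyte/internal-standard peak-area ratio
known_conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
area_ratio = np.array([0.021, 0.098, 0.205, 1.02, 1.98])

slope, intercept = np.polyfit(known_conc, area_ratio, 1)   # linear fit

def quantify(sample_ratio):
    """Convert a sample's area ratio to an absolute concentration (ng/mL)."""
    return (sample_ratio - intercept) / slope

print(f"sample at ratio 0.42 -> {quantify(0.42):.1f} ng/mL")
```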

Monarch has already developed targeted LifeMarker assays for many proteins. “We have off-the-shelf assays in various therapeutic areas available today and continue to add to that list,” reports Tegeler.

Low-abundance Serum Proteins

“Protein biomarker discovery in serum is challenging because of the vast dynamic range and high number of proteins circulating in serum,” notes Koen Kas, CSO of Pronota (www.pronota.com). His talk will focus on how complexity reduction of the proteome can be achieved through the sub-selection of particular peptides using a range of different procedures.

Often, the building blocks of the protein are known to scientists, but some proteins are prisms, which makes profiling challenging. “What we’ve done is create a biomarker discovery platform by coupling smart chemistries and peptide separation techniques with novel means of mass spectrometry,” explains Kas. “The result is called MASStermind.

“In the N-terminomics Cofradic technology, which constitutes the core of the MASStermind platform, every free N-terminal tryptic peptide of a protein or protein fragment is tagged,” says Kas. “Diagonal chromatographic steps select and separate only these N-terminal fragments, whereas all nonterminal fragments, generated by the tryptic digest, are removed from the peptide mixture, thereby significantly reducing the complexity in comparison with standard proteomics technologies.

“This N-terminal selection strategy demonstrates an unprecedented power to discover low-abundant serum proteins and to reveal that serum proteins are subject to unexplored cleavage by endogenous proteases,” explains Kas. “For the first time, using a nontargeted proteomics approach, we have shown that it becomes possible to get into the nanogram/milliliter sensitivity range. The result is greater diagnostic power. If you have a fishing pool of a few thousand markers, not just the high-abundant acute-phase proteins seen by others, then you have a better chance of finding candidate markers. And that’s our goal—to increase the biomarker fishing pool—to make it so large that we can find low-abundant, specific, and sensitive biomarkers for better disease regimens.

“As these markers were discovered in an extremely small sample set of patients with the aim of diagnosing sepsis, the next challenge is to confirm the power of combinations of that novel set of markers for disease in large populations.”
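The complexity reduction behind the N-terminal selection Kas describes can be illustrated in silico: from each protein, or each protease-generated fragment, only the N-terminal tryptic peptide is kept, so the peptide mixture shrinks from every tryptic peptide to roughly one per protein form. The toy sequences below are invented, and the code models only the outcome; the actual Cofradic workflow achieves the selection chemically, through N-terminal tagging and diagonal chromatography.

```python
# Illustrative sketch of N-terminal selection: keep only the N-terminal tryptic
# peptide of each protein (or protease-generated fragment). In silico model of
# the outcome only; the chemistry and chromatography are not simulated here.

def tryptic_peptides(sequence):
    """Split a sequence after K/R (ignoring the K/R-P rule for brevity)."""
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        if aa in "KR":
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])
    return peptides

def n_terminal_only(proteins):
    """Keep just the N-terminal tryptic peptide of each protein or fragment."""
    return {name: tryptic_peptides(seq)[0] for name, seq in proteins.items()}

# Toy mixture: two intact proteins plus an endogenous cleavage fragment of one
proteins = {
    "PROT_A": "MSTKQLVRAADEKFGHRLLWK",
    "PROT_B": "MAVLRDDQKTTSPRGGHKEEL",
    "PROT_A_fragment": "AADEKFGHRLLWK",   # protease-clipped form of PROT_A
}

all_peptides = {name: tryptic_peptides(seq) for name, seq in proteins.items()}
n_term = n_terminal_only(proteins)

total = sum(len(p) for p in all_peptides.values())
print(f"full digest: {total} peptides; N-terminal selection: {len(n_term)} peptides")
print(n_term)   # the fragment's distinct N-terminus reveals the internal cleavage
```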
