June 15, 2008 (Vol. 28, No. 12)

Kathy Liszewski

Better Dialogue from Discovery to Clinic Could Be Key to Improving Process

Biomarker discovery has exploded in the last decade, yet researchers often struggle to validate their biomarkers for widespread clinical use. Several recent meetings including CHI’s “Molecular Medicine Tri-Conference” and “Biomarker World Congress” as well as SMi’s “Biomarker Summit” highlighted current challenges, offered new insights, and presented cutting-edge technological advances.

Today, almost every journal features an article on next-generation biomarkers. “So, why isn’t the market flooded with new biomarkers?” asked Stephen M. Hewitt, Ph.D., clinical investigator and chief of the tissue array research program in the laboratory of pathology, center for cancer research, NCI.

“The answer is that many biomarkers lack clinical utility. For a new biomarker to provide clinical utility it must be better than current grading and staging. To prove it is better, there are two challenges: First, there must be identification of appropriate cohorts of patients to perform validation, and second, there must be high-quality clinical annotation of those samples.”

According to Dr. Hewitt, development of successful biomarkers should be viewed in the same perspective as developing a new drug. “It is a process of screening, discovery, and validation, where large numbers of candidate biomarkers are excluded in the process.”

Dissecting out protein-expression patterns from tissue can help identify clinically useful biomarkers. “The tissue microarray has allowed cohorts to be put on one platform, often one slide. Immunohistochemistry is a powerful tool to study protein expression. The challenge here is moving from a research assay to understanding the nuts and bolts of a relevant assay for use in a clinical setting.”

The path toward clinical utility requires examination of analyte specification as well as assay performance to verify useful biomarkers. “We need to conquer issues of inter- and intralab reproducibility,” Dr. Hewitt said. “This requires considering the use of calibrators (more than negative and positive controls) to ensure low expression is assayed as low and high expression as high. In addition, there is no such thing as a standardized protocol for fixing tissue. Yet data for biomarkers can differ depending on which fixative is used and for how long. We need to adopt a mentality of rigor that is used for ELISA and other chemistry assays.”

Qualification

Deciding which biomarkers to pursue and which to scrap depends largely on evaluations centering on biomarker validation and qualification. “These two standards describe very different but critical processes,” noted Begona Barroso, Ph.D., associate director, bioanalysis, Astellas Pharma (www.astellas-europe.co.uk).

“Validation refers to the process of assessing the assay, determining its suitability, and measuring performance characteristics,” Dr. Barroso explained. “Method validation for biomarkers is a process of assay refinements that are generally performed using the fit-for-purpose approach. This broad term means you must demonstrate that the method is reliable for the intended application.”

Ann Hastings, Ph.D., scientist, translational science section at Astellas, agreed. “Your biomarker is only as good as the methods you use to identify and characterize it. For biomarker validation, it’s important that assays be accurate, precise, and lie within a linear range. Other critical aspects include specificity, sensitivity, stability, method robustness, and good document control.”
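Two of the criteria Dr. Hastings lists, linearity and precision, can be checked with simple calculations. The sketch below, using hypothetical calibration and replicate data, fits a calibration curve and reports R² (linear range) and CV% (precision):

```python
# Minimal fit-for-purpose checks on hypothetical assay data:
# linearity (R^2 of a calibration curve) and precision (CV% of replicates).

def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and R^2 for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

def cv_percent(replicates):
    """Coefficient of variation (%) for replicate measurements of one sample."""
    n = len(replicates)
    mean = sum(replicates) / n
    sd = (sum((r - mean) ** 2 for r in replicates) / (n - 1)) ** 0.5
    return 100 * sd / mean

# Hypothetical standard curve (ng/mL vs. assay response) and replicate set.
conc   = [0.0, 1.0, 2.0, 4.0, 8.0]
signal = [0.05, 1.02, 2.10, 3.95, 8.10]
slope, intercept, r2 = linear_fit(conc, signal)
print(f"R^2 = {r2:.4f}")                              # near 1.0: linear range
print(f"CV = {cv_percent([4.8, 5.1, 5.0, 4.9]):.1f}%")  # low CV: good precision
```

In a fit-for-purpose validation, the acceptance limits for R² and CV would be set by the intended application rather than fixed thresholds.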

A second key ingredient in the successful selection and implementation of good biomarkers is called biomarker qualification. “Qualification links the biomarker with biological and clinical endpoints,” Dr. Barroso said. “It really refers to clinical validation based on biological and statistical consistency. The marker must be consistent with pathophysiology and connected with an intervention pathway. Moreover, changes in the biomarker must be correlated with clinical outcome and reflect changes in prognosis.”

Ultimately, one of the best routes for selecting good biomarkers lies simply in an improved dialogue for all scientists on the project, from discovery to clinical researchers. “You really need good communication in every aspect from preclinical to clinical,” Dr. Hastings said. “Identification of the most relevant biomarkers can bridge the gap between preclinical and clinical.”

DNA Methylation

DNA methylation is critical for a variety of biological processes that range from imprinting to stem cell formation, making it a potential biomarker for both prognostic and diagnostic applications. “Methylation of DNA is a well-known and important regulatory mechanism as well as a signal for certain cancers,” noted Cathy Lofton-Day, Ph.D., vp of molecular biology, diagnostics at Epigenomics (www.epigenomics.com).

Dr. Lofton-Day presented her work at the “Biomarker World Congress” last month. “In the last 20 years, scientists have realized that gene promoters become methylated and turn off. When the promoter is for a tumor suppressor gene, this has been linked to development of cancers.”

Because changes occur early in the neoplastic process, detection of methylation alterations can allow early detection of cancer. In particular, Dr. Lofton-Day has been studying early detection of methylation in colorectal cancer (CRC). “Our goal is to develop a noninvasive test for colorectal cancer. When detected in its early stages, survival is greatly increased. The problem is that current noninvasive tests are based on detection of blood in stool and people do not use them.

“Colonoscopy is invasive, expensive, and can be associated with some morbidity and mortality. Our strategy is to detect free genomic DNA in blood that is derived from tumors. To achieve this, we identified a region of the Septin 9 (SEPT9) gene that is methylated in more than 90% of colorectal cancer tissues with little or no methylation in normal colon tissue or other controls. This suggests it can be used as a biomarker in blood indicating the presence of colorectal cancer.”

The technology initially employed to identify the methylation of SEPT9 involved a methylation-sensitive, restriction enzyme-based detection method. “Once we identified SEPT9 as a suitable biomarker, we developed a highly sensitive real-time PCR-based assay in which the DNA sample is treated with bisulfite allowing us to identify methylated from nonmethylated DNA,” Dr. Lofton-Day reported. “We verified the marker performance in blood in case control studies of more than 3,000 patient samples.”
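The logic behind bisulfite discrimination can be illustrated with a toy example (the sequence and methylation positions below are hypothetical, not the actual SEPT9 assay): bisulfite converts unmethylated cytosines to uracil, read as T after PCR, while methylated cytosines remain C, so tumor-derived and normal copies of the same region yield different sequences.

```python
# Toy model of bisulfite conversion: unmethylated C -> T (via uracil),
# methylated C stays C. Sequence and positions are hypothetical.

def bisulfite_convert(seq, methylated_positions):
    """Convert every C to T except those at methylated positions."""
    return "".join(
        base if base != "C" or i in methylated_positions else "T"
        for i, base in enumerate(seq)
    )

promoter = "ACGTCGACGT"
# Tumor-derived copy: CpG cytosines (positions 1, 4, 7) methylated.
tumor = bisulfite_convert(promoter, methylated_positions={1, 4, 7})
# Normal copy: no methylation, so every cytosine converts.
normal = bisulfite_convert(promoter, methylated_positions=set())
print(tumor)   # ACGTCGACGT  (methylated cytosines preserved)
print(normal)  # ATGTTGATGT  (all cytosines converted)
```

Because the two products now differ in sequence, primers or probes matching the retained cytosines amplify only the methylated (tumor-derived) template in real-time PCR.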

Now the challenge is to make the assay “clinic friendly” and perform clinical trials. To do that, Dr. Lofton-Day has geared the assay for high-throughput applications and has launched a multicenter clinical trial targeting the CRC screening population. “We are excited about the potential of the Septin 9 marker and especially this new and emerging technology.”

Poor reproducibility of the methods used for qualitative and quantitative proteome analysis can severely restrict biomarker discovery as well as validation, according to Peter Schulz-Knappe, Ph.D., CSO, Proteome Sciences (www.proteomics.com).

“Currently, there is no universal technology for proteomics. It is segmented into many different technologies and workflows, because there are a myriad of proteins and peptides each possessing different profiles as to biochemical activity, size, form, etc. Reproducibility of studies is poor and comparability between labs is often not possible. A comprehensive and sensitive technology is needed in order to improve this situation.”

One of the newest and best technologies for biomarker discovery and development is to use mass spectrometry combined with quantitative tags for analysis, noted Dr. Schulz-Knappe. “At Proteome Sciences, we have patents covering the field of isobaric mass tags and have introduced Tandem Mass Tags® (TMT).

“All tags have the same overall mass and identical physico-chemical properties. After individual labeling of up to six different patient samples, the samples can be mixed and processed together. Labeled proteins behave identically during all steps such as sample handling and chromatography, but since the mass tags have individual fragmentation patterns, the patient proteins can be quantified specifically during mass spectrometry. TMT’s only prerequisite is that you must be able to label the sample and that the analyte can be detected with MS.”

The idea behind the TMT technology is that the quantitative ratio between individual proteins remains constant after mixing of samples. “This conservation is key for subsequent separation and analysis. It allows much better control, precision, and accuracy to study hundreds to thousands of proteins.”
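The quantification step can be sketched as a ratio calculation (the intensities below are hypothetical): after fragmentation, each labeled sample contributes one reporter ion, and relative abundance across the mixed samples is read from reporter intensities against a reference channel.

```python
# Sketch of isobaric-tag readout: one peptide, six reporter channels,
# abundance expressed relative to a reference channel. Values are hypothetical.

def channel_ratios(reporter_intensities, reference_channel=0):
    """Abundance of each labeled sample relative to a reference channel."""
    ref = reporter_intensities[reference_channel]
    return [i / ref for i in reporter_intensities]

# Reporter ion intensities for one peptide across six labeled samples.
intensities = [1000.0, 980.0, 2100.0, 1050.0, 510.0, 995.0]
ratios = channel_ratios(intensities)
print([round(r, 2) for r in ratios])
# Channel 3 (~2.1x) is elevated and channel 5 (~0.5x) reduced vs. the reference.
```

Because the ratio is fixed at the point of mixing, downstream losses during separation affect all channels equally and cancel out of the comparison.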

Proteome Sciences has also made advances in the development of reference standards for proteomics. “Clinical laboratories typically use reference materials to calibrate and certify assays. Unfortunately, proteomics lacks such materials to benchmark performance. We decided to address this need and now provide TMT-labeled reference materials for blood plasma, urine, and other samples. These materials serve as a defined proteome reference. Using this as a control sample, scientists can now run quality control and share their results worldwide. We expect reference materials to help speed up discovery and development of biomarkers.”

Building Robust Bioassays

To fill the gap between biomarker discovery and clinical utility, robust assays must be developed, said Philip M. Hemken, Ph.D., principal scientist, cancer research diagnostics, Abbott Laboratories (www.abbott.com). “Investigators need to be rigorous in validating biomarkers early in the process. There are several key points for developing a robust assay.

“First, there needs to be suitable product definition to clarify intended use and define product requirements (e.g., sample precision, both analytical and clinical specificity and sensitivity),” Dr. Hemken pointed out. “Secondly, there are preanalytical considerations such as accounting for specimen tube type, specimen stability, biorhythm variability, male versus female, and age.”

It’s important to verify the suitability of the assay design, Dr. Hemken added. “For example, it’s crucial to determine the best format such as competitive versus sandwich assays, the range of the curve required, and buffer components. Design of experiments can be used to optimize the reagent design.”

Another consideration for making the assay robust is to carefully characterize all reagents in the assay. “Questions to ask include: How pure are my antibody and antigen preparations? Is there any aggregation? How stable are they? What is the specificity of the antibody? Does the recombinant antigen behave similarly to the native antigen?”

Finally, Dr. Hemken advised carefully scrutinizing the analytical performance of the assay. “Important aspects include determining reagent stability, precision, lot-to-lot uniformity, dilution linearity, causes of interference, high-dose hook effect, and spike recovery.”
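Two of those checks reduce to simple arithmetic, illustrated here with hypothetical numbers: spike recovery asks whether the assay finds a known amount of added analyte, and dilution linearity asks whether diluted samples back-calculate to a consistent neat concentration.

```python
# Hypothetical examples of two analytical performance checks.

def spike_recovery(measured_spiked, measured_unspiked, spiked_amount):
    """Percent of a known spiked amount recovered by the assay."""
    return 100 * (measured_spiked - measured_unspiked) / spiked_amount

def dilution_linearity(measurements, dilution_factors):
    """Back-calculated concentrations; ideally all near the neat value."""
    return [m * d for m, d in zip(measurements, dilution_factors)]

# Spike 10 ng/mL into a sample reading 4.0; the spiked sample reads 13.6.
print(f"Recovery: {spike_recovery(13.6, 4.0, 10.0):.0f}%")
# 1x, 2x, and 4x dilutions of one sample measured at 20.2, 10.3, 5.0 ng/mL.
print(dilution_linearity([20.2, 10.3, 5.0], [1, 2, 4]))
```

Recovery well below or above 100%, or back-calculated values that drift with dilution, would point to matrix interference or a hook effect worth investigating.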

Dr. Hemken’s take-home message was that, “in order to produce robust biomarker assays, it’s important to increase analytical performance testing and stringency. The challenge is to be much more diligent in preanalytical testing, assay design selection, and reagent characterization. Only then will you be able to more confidently address the clinical utility of a biomarker.”

Thinking Outside the Plate

A new wave of multiplexing technologies is paving the way for their use not only in biomarker discovery and validation but also for a host of applications that range from diagnostics to prognostics. “Multiplexing assays provide faster, more efficient analyses that are less costly than using multiple singleplex methods,” said Paul Rhyne, Ph.D., associate director, pharmaceutical candidate optimization, Bristol-Myers Squibb (www.bms.com).

Dr. Rhyne uses several technologies for identifying and validating biomarkers. “We routinely use three platforms. Luminex’ (www.luminexcorp.com) xMAP® consists of color-coded beads that can be coated with a reagent specific to a particular bioassay, allowing the capture and detection of up to 100 different analytes at the same time. Soon that will go to 500. This provides great flexibility because you can control the bead set to mix and match analytes.”

The other two platforms Dr. Rhyne uses are plate-based multiplexing systems. “These are planar arrays that use antibodies as capture reagents. Meso Scale’s (www.meso-scale.com) Discovery MultiArray Technologies use a combination of electrochemiluminescent detection and patterned arrays. Another technology we use is Pierce’s (www.piercenet.com) SearchLight, which quantifies up to 16 chemiluminescent or 24 infrared proteins per well. All three multiplexing platforms are sensitive and reliable; however, each has different advantages and disadvantages relating to application, high throughput, automation, and equipment.”

Dr. Rhyne advised investigators to consider thinking outside the “plate” regarding multiplexing. “Applications for multiplexing are wide open. Traditionally, they were used for cytokine analysis, but now we see they can be used for much more such as validating biomarker assays and for diagnostics. For example, it is now possible to take a patient’s blood sample and use a multiplexed assay such as Luminex’ respiratory viral panel to confirm viral infection, identify the virus, and rapidly adjust the care needed. Multiplexing is a technology that is amenable to many more applications.”

Most investigators agree that in the future biomarker development and validation will become a more organized and industrialized process. Predictive biomarkers will progress from single measurement to multianalyte assays. All in all, advances should provide an improved decision tree for steadily moving biomarkers through the pipeline.
