June 15, 2009 (Vol. 29, No. 12)
Enabling Technologies Are Now Available, but Careful Testing Is Advised Before Adoption
The quest for highly predictive biomarkers is as enthralling as the medieval quest for the Holy Grail and, potentially, more rewarding for patients throughout the globe. The current challenge isn’t to identify potential biomarkers, but to validate them, ascertaining that they are, in fact, accurate and predictive for certain conditions.
Virtually every biotech company engaged in drug development has a biomarker program under way, either on its own or through partnerships or other ventures. Some organizations have even formed consortia to advance the research beyond what any single organization could achieve. Many of these strategies were outlined at CHI’s “Biomarker World Congress” held last month in Philadelphia.
The Critical Path Institute, for one, is developing innovative collaborations in research and education to accelerate medical product development, with the goal of developing new tools and methods to modernize the scientific process in support of the FDA’s Critical Path Initiative, according to Maryellen de Mars, Ph.D., director of the organization’s Maryland office.
Currently, in its clinical biomarker program, C-Path is working with the NIH on two Phase III trials for cancer and for heart disease. “C-Path has pulled together subject matter experts, regulators, diagnostic companies, and other stakeholders as needed to qualify biomarkers for improved clinical decision-making and outcomes,” she said.
C-Path’s cancer program focuses on validating EGFR as a biomarker for response to tyrosine kinase inhibitors (a class of drugs that Dr. de Mars said is effective in only 20–30% of patients). “There is evidence,” she added, “that EGFR status—the copy number, expression level, and mutation status, for example—could be indicative of response to treatment.” The purpose of the ongoing NCI MARVEL clinical trials, Dr. de Mars explained, is to validate EGFR as a biomarker for response to the tyrosine kinase inhibitor erlotinib. “The information from this effort can be used to qualify and validate other cancer biomarkers for targeted therapy.”
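To see what such a validation must demonstrate statistically, consider a minimal sketch (in Python) of the core comparison: response rates stratified by marker status. The counts below are invented for illustration and do not come from the MARVEL trials or any C-Path data.

```python
# Hypothetical illustration: testing whether EGFR status predicts response
# to a tyrosine kinase inhibitor. All counts below are invented.
from scipy.stats import fisher_exact

# Rows: EGFR-positive vs. EGFR-negative; columns: responders vs. non-responders
table = [[18, 42],   # EGFR-positive: 18 of 60 respond (30%)
         [ 6, 54]]   # EGFR-negative:  6 of 60 respond (10%)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```

An odds ratio well above 1 with a small p-value would support, though not by itself establish, EGFR status as predictive of response; qualification requires replicating such findings under prespecified criteria.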
C-Path’s cardiovascular biomarker program involves improving the safety of warfarin, which has serious side effects. Currently, dosage is determined using only clinical factors and monitored with a blood test, “but there’s a lot of individual variation, so it is easy to over- or underdose patients,” Dr. de Mars said.
“Variations in two genes—CYP2C9 and VKORC1—account for up to 50% of the individual variability seen in warfarin response. The question remains as to whether genotypic information can reduce adverse events and improve outcomes for those patients.” C-Path is coordinating the diagnostics aspects of the NHLBI clinical trial, which was slated to begin in May to evaluate the benefit of routine genotyping for warfarin patients.
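Pharmacogenetic dosing algorithms of the kind such a trial evaluates typically take the form of a linear model over clinical and genotype covariates. The sketch below shows only that structure; every coefficient and encoding is invented for illustration, and this is not a clinically valid dosing formula.

```python
# Sketch of a genotype-informed warfarin dosing model (linear in covariates).
# All coefficients and encodings are invented for illustration only; real
# algorithms are derived from large, ethnically diverse cohorts.

def weekly_warfarin_dose_mg(age_decades: float,
                            cyp2c9_variant_alleles: int,   # 0, 1, or 2
                            vkorc1_a_alleles: int,         # 0, 1, or 2
                            amiodarone: bool) -> float:
    dose = 35.0                           # hypothetical baseline weekly dose
    dose -= 2.0 * age_decades             # dose requirement falls with age
    dose -= 5.0 * cyp2c9_variant_alleles  # reduced-function CYP2C9 alleles
    dose -= 6.5 * vkorc1_a_alleles        # warfarin-sensitive VKORC1 haplotype
    if amiodarone:
        dose -= 4.0                       # interacting co-medication
    return max(dose, 5.0)                 # floor the estimate

# A 70-year-old with one CYP2C9 variant and one VKORC1 A allele:
print(weekly_warfarin_dose_mg(7.0, 1, 1, amiodarone=False))
```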
In a similar approach, the International Serious Adverse Events Consortium (SAEC) is working with the FDA to look into the genetic basis of certain types of adverse drug events and the use of biomarkers to better predict a patient’s risk of developing those conditions. The resulting information will be publicly available. The SAEC is focusing upon drug-induced liver damage and drug-induced skin rashes, such as Stevens-Johnson syndrome and toxic epidermal necrolysis.
SAEC also recently began a collaboration with St. George’s University of London and the Drug-Induced Arrhythmia Risk Evaluation Network to identify genetic markers to predict the risk of drug-induced torsades de pointes (a ventricular arrhythmia).
“Generally, genomics hasn’t panned out the way we thought it would,” explained Arthur Holden, Ph.D., founder and chairman, “because it’s very complicated.” Therefore, the consortium decided to focus on relatively rare events that yield highly specific phenotypes. The hope is that findings from such well-defined phenotypes will prove applicable to other programs.
Working with GlaxoSmithKline (GSK), SAEC has identified many genetic associations that seem to contribute to an individual’s risk of developing serious, drug-induced skin reactions. Other research has identified biomarkers for liver injury and other adverse events. The work found, for example, that a variant of the HLA-B gene on chromosome 6 (HLA-B*5701) is highly predictive of liver injury when abacavir, a nucleoside analog reverse transcriptase inhibitor developed by GSK to treat HIV/AIDS, is administered, Dr. Holden said. Other associations on chromosome 6 are also predictive of adverse events associated with the GSK antibacterial therapy Augmentin, he said.
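What “highly predictive” means in practice can be expressed in standard screening metrics. The sketch below computes them from a 2x2 table; the counts are hypothetical placeholders, not SAEC or GSK data.

```python
# Computing test metrics for a hypothetical pharmacogenetic screen.
# The counts are invented placeholders, not SAEC/GSK findings.

def screen_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and predictive values from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),  # marker carriers among reactors
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # reaction risk if marker-positive
        "npv": tn / (tn + fn),          # safety if marker-negative
    }

# Hypothetical cohort: 1,000 patients, 5% adverse-event rate, with the
# marker present in most reactors and few non-reactors.
print(screen_metrics(tp=45, fp=30, fn=5, tn=920))
```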
The researchers worked with only a small cohort of Northern Europeans, so more data is needed from larger, more ethnically diverse cohorts before that information can be used clinically, Dr. Holden pointed out. That said, he expects to release a final report on this aspect of drug-induced liver events within the next few months.
Michael Burczynski, Ph.D., associate director of the biomarker lab at Wyeth Pharmaceuticals, considers it critical that emerging technologies be characterized before biomarkers are developed based upon those technologies. The premise, Dr. Burczynski said, is that novel technologies are continually evolving and that a thorough understanding of the molecular principles by which they work, as well as their limitations, can affect the ability to identify and/or adequately measure biomarkers.
New technologies can be exciting and are helping to identify a wide range of new biomarkers. Researchers who are eager to reap the benefits of new technologies “are not always able to perform sufficient due-diligence testing to fully characterize the new technologies,” he said. Yet, putting the technology through its paces is critical, and “you have to set aside time to do it.”
The reason this is so important, Dr. Burczynski explained, is that “there can be a disconnect between the actual performance and that claimed by developers or understood by researchers. Hype surrounding an advance can lead to unrealistic expectations,” regarding how to use the technology, how to analyze the results, and how to use that information to make decisions. “If you don’t understand the limitations of the new technology, how can you interpret the data correctly?” he asked.
A key example comes from the early days of bioinformatics, when some of the algorithms used to analyze biological data had originally been developed for business applications. Although the test results themselves were valid, the analyses built on them were not. By not understanding the nature of an algorithm and its limitations, some researchers drew erroneous conclusions.
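A classic pitfall of this kind is that clustering algorithms impose structure whether or not any exists. The brief sketch below, which runs scipy’s hierarchical clustering on pure noise, shows how confidently an algorithm can report groups that mean nothing, which is exactly why its assumptions must be understood first.

```python
# Demonstration that a clustering algorithm imposes structure on pure noise.
# Hierarchical clustering of random "expression" data still yields
# clean-looking clusters; the algorithm cannot say whether they are real.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
noise = rng.normal(size=(30, 100))      # 30 "samples" x 100 random "genes"

tree = linkage(noise, method="ward")    # Ward-linkage hierarchical clustering
labels = fcluster(tree, t=3, criterion="maxclust")  # force three clusters

# Every sample is confidently assigned to a cluster, despite no real signal.
print(labels)
```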
A more successful example is provided by Dr. Burczynski’s own experience with the PAXgene tubes from PreAnalytiX, a joint venture between BD and Qiagen. “We were really excited about what the tubes could enable,” he recounted. When his lab took the time to evaluate the PAXgene tubes to understand their capabilities and limitations, they “turned out to be as enabling as we had believed,” but also had some key restrictions regarding their use. To use the tubes to best advantage, it was important to step back and understand their realistic potential as well as their constraints.
There are no guidelines for characterizing emerging technology prior to biomarker development. “Due diligence applies to innovative reagents as well as endpoint-detection instruments,” Dr. Burczynski stressed, “so you can’t come up with a standard set of guidelines.” Instead, the technology and its intended uses will determine how due diligence will be conducted, and the process should be company-specific, he said. The main challenge is simply carving out the time in which to characterize new technologies.
A similar need for analysis applies to the biomarkers themselves. M. Walid Qoronfleh, Ph.D., vp of business development at NextGen Sciences, explained at the AACR conference that, although the ability to generate biomarker candidates is growing steadily, the ability to develop assays to validate these putative biomarkers and move the most viable candidates forward has become a major bottleneck.
NextGen is addressing that bottleneck by using multiplexed mass spectrometry-based assays to shorten assay development time and thereby allow resources to be focused on the use of the biomarker rather than upon assay development.
The focus is on developing fit-for-purpose assay validation for biomarkers. “Depending upon the application, the level of bioanalytical assay validation may be tailored so that it is fit for a given purpose. In early biomarker discovery stages, minimal validation is required, but as the usefulness of a single biomarker or a panel of multiple biomarkers is more clearly defined, the level of assay validation increases,” noted Richard C. Jones, Ph.D., head of mass spectrometry at NextGen.
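In practice, such tiered validation is often operationalized as acceptance criteria that tighten with the intended use, for instance on replicate precision. The sketch below illustrates the idea with invented %CV thresholds; real criteria are assay- and program-specific.

```python
# Sketch of tiered, fit-for-purpose acceptance testing on assay precision.
# Thresholds are invented for illustration; real criteria vary by assay.
import statistics

TIER_CV_LIMITS = {          # max acceptable %CV per intended use (hypothetical)
    "discovery": 30.0,      # early screening tolerates more variability
    "qualification": 20.0,
    "clinical": 15.0,       # decision-making demands the tightest precision
}

def passes_tier(replicates: list[float], tier: str) -> bool:
    """Check intra-assay %CV of replicate measurements against a tier limit."""
    cv = 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
    return cv <= TIER_CV_LIMITS[tier]

readings = [10.2, 9.8, 11.0, 10.5, 9.6]   # replicate peak areas (hypothetical)
for tier in TIER_CV_LIMITS:
    print(tier, passes_tier(readings, tier))
```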
Validation matters at two levels—matching the assay to users’ needs and intended use, and validating it to the standards of the FDA or other regulatory bodies. “There also are two levels of validation—validating the assay and validating the biomarker,” Dr. Qoronfleh said. Sometimes one will validate the other. Oftentimes, however, what gets validated is simply an assay for a particular protein.
“That doesn’t mean that the protein is a valid biomarker,” he emphasized, asserting that validity depends upon other factors, including having the right sample and even the right sample size. A cohort of five to ten patients, for example, is too small to yield statistically significant results, he said. “You may need a large cohort. That’s where fit-for-purpose concepts come into play.”
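His point about cohort size can be made quantitative with a standard power calculation. In the illustrative sketch below (the effect size and settings are assumptions, not NextGen figures), detecting a moderate between-group difference requires roughly 45 patients per group, while a ten-patient group achieves well under half the conventional 80% power.

```python
# Why five to ten patients is rarely enough: a standard two-sample power
# calculation. Effect size and settings are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Patients per group needed to detect a moderate difference (Cohen's d = 0.6)
# with 80% power at alpha = 0.05:
n_per_group = analysis.solve_power(effect_size=0.6, power=0.8, alpha=0.05)
print(round(n_per_group))   # roughly 45 per group, far more than 5-10

# Conversely, the power achieved by a 10-patient-per-group cohort:
power = analysis.solve_power(effect_size=0.6, nobs1=10, alpha=0.05)
print(round(power, 2))      # well under the conventional 0.8
```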
Alzheimer’s disease offers a good case study. There is a sizable list of proteins involved in the disease, so the list of putative biomarkers for Alzheimer’s is also large. “We developed multiplexed assays for all those proteins,” Dr. Qoronfleh said, and then validated those assays to determine whether they were fit-for-purpose. The purpose, in this case, was to select the patients most likely to respond in clinical trials of particular Alzheimer’s medications.
Different parameters would have been in play had the assay been developed for a different purpose, such as identifying those likely to develop Alzheimer’s disease. The results of such trials may, eventually, better match patients and therapies, helping to ensure that patients are prescribed the therapies most promising for them early in their treatment. This would eliminate much trial and error, as well as potential side effects from less tightly targeted therapies.
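For a sense of how a validated multiplexed panel might feed patient selection, the sketch below combines several protein measurements into a single enrollment score. All of the data, the five-protein panel, and the cutoff are synthetic and purely illustrative.

```python
# Sketch: combining a multiplexed protein panel into a single selection score.
# Data, panel composition, and threshold are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# 40 hypothetical patients x 5 panel proteins (log abundances), plus a
# synthetic "responder" label loosely tied to the first two proteins.
X = rng.normal(size=(40, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=40)) > 0

model = LogisticRegression().fit(X, y)
risk_scores = model.predict_proba(X)[:, 1]    # probability of response

# Enroll only patients whose panel score clears an illustrative cutoff.
selected = risk_scores > 0.6
print(f"{selected.sum()} of {len(selected)} patients flagged for enrollment")
```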
To aid in biomarker development, Bio-Rad Laboratories and Bruker Daltonics are working together to develop the Lucid Proteomics System for high-throughput protein profiling and identification. By combining Bio-Rad’s surface-enhanced laser desorption/ionization (SELDI) technology and Bruker’s MALDI-TOF/TOF mass spectrometers, “researchers can perform top-down and bottom-up discovery on the same system and have more information about the biomarkers,” Yves Courties, senior product manager at Bio-Rad, explained.
That is possible because the system does not require the protein to be digested prior to profiling, enabling the detection and identification of post-translationally modified or truncated proteins and peptides, he added.
Bio-Rad recently introduced the first product in that family, the Lucid ID Access Pack, which provides SELDI accessories and consumables to identify native protein and peptide biomarkers using Bruker’s AutoFlex and UltraFlex MALDI-TOF/TOF systems. According to Bio-Rad, the product provides specific solutions for characterizing intact proteins and peptides under 30 kDa, which are challenging for currently available technologies to identify.