August 1, 2010 (Vol. 30, No. 14)

Elizabeth Lipp

Early Use Can Mitigate Some Risks and Increase Likelihood of Compound Advancement

Just as a builder asks which tools are best suited to the job, so do scientists working to create compounds to send into the clinic. Finding the protein and pathway targets of interest is only part of the puzzle; the search is only as good as the tools scientists can employ.

“Companies don’t often invest in biomarkers until it is too late,” noted Mark Parrish, senior manager of assay development at Covance. “Scientists need to think broadly—cast a wide net, understand your sample, and understand your platform.”

“Biomarker research has seen an exponential increase in use over the last decade, and this will continue,” added Richard Houghton, principal scientist, bioanalytical science at Quotient Bioresearch. “Biomarkers will be used earlier in the drug discovery process to demonstrate proof of mechanism, mitigating some of the development risks and ensuring that only those compounds demonstrating potential efficacy are taken into Phase I trials.”

At CHI’s “Biomarker World” conference, speakers addressed the changing role of biomarkers in research, how technology is accommodating that role, and the implications both have for drug discovery.

New Engines for Data Analysis

Data analysis remains a perpetual pain point in biomarker research. Ilya Mazo, Ph.D., president at Ariadne, noted that high-throughput data-generation methodologies, such as microarray gene-expression profiling, require new approaches for gathering information.

“Ariadne’s focus is on proprietary linguistic algorithms that help people access more information about biological pathways and systems. Our technology helps scientists make informed decisions about which biomarker to pursue.” The company believes that the computational approaches used for high-throughput data analysis require that the biological information from the literature be a coherent and integrated part of the analysis software itself.

To that end, Ariadne applies its MedScan technology to produce the ResNet (mammalian and plant) and ChemEffect (drug-centric) knowledge databases by harvesting knowledge from the literature. Users can supplement the databases with any other type of information, including in-house documents and third-party data sources. Once captured, this information is transformed into biological relationships and stored for use in hypothesis testing and verification, mechanistic modeling, and drug and patient stratification strategies.
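Ariadne’s internal data model is proprietary, but the general idea of storing literature-derived relationships for later querying can be sketched as a small graph. In the hypothetical Python example below (not ResNet content; entity names and PubMed IDs are placeholders), each edge carries a relation type and its literature evidence, and a simple query returns the reported upstream regulators of a gene of interest.

import networkx as nx

# Hypothetical entities and PMIDs, for illustration only.
g = nx.DiGraph()
g.add_edge("TNF", "IL6", relation="Expression", pmids=["12345678"])
g.add_edge("IL6", "STAT3", relation="PromoterBinding", pmids=["23456789"])
g.add_edge("DrugX", "TNF", relation="Inhibition", pmids=["34567890"])

def upstream_regulators(graph, target):
    """Return (regulator, relation, evidence) triples pointing at `target`."""
    return [(src, data["relation"], data["pmids"])
            for src, _, data in graph.in_edges(target, data=True)]

print(upstream_regulators(g, "IL6"))   # [('TNF', 'Expression', ['12345678'])]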

At the meeting, Dr. Mazo presented a case study demonstrating the use of MedScan in biomarker research. “The data make a strong case for identifying molecularly defined markers within classes of patients. Using MedScan, we compiled a knowledge database from the scientific literature to hypothesize a mechanism behind fibromyalgia (FM).

“Researchers performed genotyping using Algynomics’ pain research panel, a chip-based platform that assays 3,295 SNPs representing 350 candidate genes for pain sensitivity, inflammation, and affect. The results from the association tests were then analyzed with Pathway Studio, which identified cellular pathways both common to FM and distinct between patient clusters.”
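The study itself relied on Algynomics’ panel and Pathway Studio, but the two statistical steps it describes, per-SNP association testing followed by pathway analysis, can be illustrated generically. In the sketch below, all allele counts, gene symbols, and pathway memberships are invented; it runs a chi-square allelic association test for one SNP and a hypergeometric enrichment test over the genes carrying associated SNPs.

from scipy.stats import chi2_contingency, hypergeom

def allelic_association(case_counts, control_counts):
    """Chi-square test on (minor, major) allele counts in cases vs. controls."""
    _, p_value, _, _ = chi2_contingency([case_counts, control_counts])
    return p_value

def pathway_enrichment(hit_genes, pathway_genes, background_size):
    """Hypergeometric test: are pathway genes over-represented among hits?"""
    overlap = len(hit_genes & pathway_genes)
    return hypergeom.sf(overlap - 1, background_size,
                        len(pathway_genes), len(hit_genes))

# Invented example: one SNP and a 350-gene candidate background.
p_snp = allelic_association((60, 140), (35, 165))
p_path = pathway_enrichment({"GENE1", "GENE2", "GENE3"},
                            {"GENE1", "GENE2", "GENE4", "GENE5"}, 350)
print(p_snp, p_path)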

According to Dr. Mazo, the data provide evidence that clinical phenotypic subgroups may be underpinned by separate but specific cellular pathways, providing a rationale for individualized treatment of patients with FM.

“What we observed requires further study, but we made some promising discoveries. There is still a lot of opportunity out there. I believe we can help people by leveraging this knowledge with informatics to provide answers.”

Genomic Considerations

The Covance Genomics Laboratory operates out of Rosetta’s former gene-expression lab and is expanding into the biomarker space. The company recently launched a new discovery and translational services group, which will integrate its discovery services, antibody products and immunology services, Biomarker Center of Excellence, and the Genomics Laboratory.

Parrish said that genomics and protein biomarkers are complementary approaches. “You can certainly do one without the other, but you may find the protein biomarkers by way of the genomics. Proteomics is more specific, but you don’t see the bigger picture unless you understand how a compound will act across pathways and systems.” Having diverse applications, robust platforms, and an innovative biomarker strategy is crucial to discovery, he added.

Covance is attempting to become “an end-to-end company,” Parrish continued. “In June, we opened a state-of-the-art biorepository facility at our Greenfield, IN, location. The 20,000-sq-ft building is dedicated to long-term storage of clinical trial specimens and can store a wide range of sample types, including plasma, serum, whole blood, DNA, PBMCs, and tissue. We’re integrating across sites so that chemists can send samples to one place. We’re trying to set this up so that it’s simple and streamlined for the client.”


The Covance Genomics Laboratory uses a range of technologies to assess gene expression and sequence variation in preclinical and clinical programs, including Illumina’s Duo SNP Array. Its primary service areas include gene-expression profiling, next-generation and targeted sequencing, and genome-wide and targeted genotyping.

Quantification Using LC-MS/MS

Dr. Houghton and his co-workers have developed an LC-MS/MS method for the measurement of seven steroids in human urine and validated it for use in a regulated bioanalytical setting as a biomarker of 11β-hydroxysteroid dehydrogenase enzyme activity. The project drew upon a surrogate matrix calibration strategy for the quantification of endogenous analytes.
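Houghton did not spell out the calibration mathematics, but a surrogate-matrix strategy typically means spiking standards into an analyte-free surrogate matrix, fitting a weighted regression of analyte/internal-standard peak-area ratio against nominal concentration, and back-calculating unknowns from that curve. The minimal sketch below uses invented concentrations and a 1/x^2 weighting; it illustrates the general approach, not the validated Quotient method.

import numpy as np

# Invented surrogate-matrix standards (ng/mL) and measured peak-area ratios.
nominal = np.array([0.5, 1, 5, 10, 50, 100, 250, 500])
ratio = np.array([0.011, 0.021, 0.10, 0.21, 1.02, 2.05, 5.1, 10.2])

# np.polyfit minimizes sum(w**2 * residual**2), so w = 1/x gives 1/x**2 weighting.
slope, intercept = np.polyfit(nominal, ratio, 1, w=1 / nominal)

def back_calculate(peak_area_ratio):
    """Convert a measured analyte/IS peak-area ratio to concentration (ng/mL)."""
    return (peak_area_ratio - intercept) / slope

print(back_calculate(np.array([0.35, 3.4])))   # two hypothetical study samples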

“Data generated using this method in support of two Phase I studies, with two different drugs, showed the power of pharmacodynamic data in supporting proof of concept, drug potency and kinetics, and evaluation of food effect,” said Dr. Houghton.

While there is a particular focus on proteins and peptides as pharmacodynamic biomarkers of therapeutic effect or disease indicators, in some instances there may be endogenous small molecules that are appropriate, he reported.

“By measuring changes in the ratio of these endogenous metabolites, all within a single multiplex LC-MS/MS assay, it was possible to demonstrate both the efficacy and the potency of the drug. Steroids, by their very nature and diverse action around the body, may prove in some disease states to be excellent biomarkers, and they are relatively cheap and easy to measure.”
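Neither the specific metabolites nor the numbers were given, so the snippet below only illustrates the kind of ratio-based readout described, using hypothetical substrate and product concentrations measured in urine before and after dosing.

def metabolite_ratio(substrate_ng_ml, product_ng_ml):
    """Substrate/product concentration ratio for an enzyme of interest."""
    return substrate_ng_ml / product_ng_ml

baseline = metabolite_ratio(120.0, 60.0)    # hypothetical pre-dose sample
post_dose = metabolite_ratio(150.0, 25.0)   # hypothetical post-dose sample
percent_change = 100 * (post_dose - baseline) / baseline
print(f"ratio {baseline:.2f} -> {post_dose:.2f} ({percent_change:.0f}% increase)")

A shift in this ratio after dosing reflects enzyme inhibition; tracking it across dose groups and time points provides the pharmacodynamic readout described above.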

One of the main challenges in the field, Dr. Houghton noted, is regulatory acceptance of biomarkers, with clear guidance on validation and application in clinical trials and as diagnostics. “The FDA has recognized and is actively supporting this process, which should lead to more rapid acceptance of specific markers for particular disease states in the future.

“The pharmaceutical industry needs to continue to work hand-in-hand with regulators to ensure biomarkers become an integral part of the drug-development process. However, regulators around the world also need to recognize the cost implications to pharma companies of companion diagnostics and ensure there is adequate return-on-investment opportunity for drug companies for the added R&D spend.”

Schizophrenia

Pharmacodynamic biomarkers are critically important in drug development for examining whether drugs are modulating their intended therapeutic targets or pathways, observed Jude O’Donnell, preclinical biomarker program team leader at Almac Diagnostics. She presented preliminary data demonstrating the potential of using peripheral blood in an ex vivo model system to discover pharmacodynamic biomarker candidates for an atypical antipsychotic used to treat schizophrenia.

“This study demonstrated the potential of using blood to quantitatively assess the exposure of a schizophrenia patient to an atypical antipsychotic regimen,” said Dr. O’Donnell. “This is of particular relevance for neurological disorders such as schizophrenia, where access to the primary tissue is difficult.” Biomarker research is becoming increasingly focused upon discovery within peripheral tissues such as blood, she added. “The ideal biomarker should be detected in a surrogate tissue thus avoiding the need to perform invasive tests such as biopsy.

“As effective determination of a biomarker can require taking multiple samples at multiple time points, the ease of peripheral blood collection offers obvious advantages over other tissues with respect to longitudinal sampling. Blood offers a wealth of material appropriate for biomarker research, ranging from circulating tumor cells to serum markers and circulating nucleic acids.”

From a drug-development perspective, Dr. O’Donnell noted that the modern era of molecular therapeutics poses considerable challenges as the targeted nature of these new agents often reduces toxicities and renders assessment of adequate/optimal exposure difficult. “Often the biologically effective dose of novel agents is considerably less than the toxic dose.

“Escalating to just below toxicity can lead to drug failure in clinical trials, as this increases the incidence of adverse events. Consequently, there is increasing emphasis on biomarker research at the preclinical and early clinical trial stages to provide a robust means of determining drug dose and effect.” Preclinical ex vivo modeling provides a time- and cost-effective means of assessing the feasibility of discovering a biomarker, pharmacodynamic or otherwise, in a peripheral tissue source such as blood at an early stage in drug discovery and development.


Almac Diagnostics is focused on the discovery and development of biomarkers for its own research as well as for clients for whom it provides a range of translational genomic services.

Subproteome Enrichment

Proteome complexity hampers protein biomarker identification, noted John C. Rogers, Ph.D., manager of mass spectrometry reagents at Thermo Fisher Scientific. “As we looked at what was problematic in biomarker research, we also wanted to differentiate with good science. Where are the pain points? Sample prep and data analysis are the leading culprits. We want to give scientists the right tools for proteomics success.”

Thermo Scientific cysTMT Mass Tag Reagents are thiol-reactive isobaric tags for labeling reduced cysteine residues to perform multiplex quantitative mass spectrometry analysis of protein expression.

When used in combination with the Immobilized Anti-TMT Antibody Resin, cysTMT tags reportedly allow specific labeling, enrichment, and quantitation of cysteine-containing peptides and proteins from complex protein mixtures. Because cysteine residues are relatively rare in protein mixtures, enriching for the peptides that contain them is a beneficial strategy for reducing sample complexity before MS analysis.

For TMT-labeled proteins and peptides, this can be accomplished using an immobilized anti-TMT monoclonal antibody that has high affinity (Kd<100 pM) for the reporter region of the TMT Reagent.

“Enrichment of labeled peptides by this method results in fewer peptides identified per protein but can significantly increase the number of unique proteins identified, making identification and quantitation of low abundance proteins easier,” said Dr. Rogers. “In addition to improving the dynamic range of sample analysis, cysTMT reagents can also be used to assess the oxidation or S-nitrosylation state of cysteine residues.

“When you look at the three things everyone is trying to optimize—sensitivity, scalability, and comprehensiveness—you have to understand that you can’t have all three. There is a trade-off. We choose to optimize for sensitivity and relevance during discovery, then sensitivity and scalability during assay development and validation. We are pushing for relevant proteins.”
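The reporter-ion quantitation itself was not detailed in the talk. The generic sketch below (not Thermo software) shows how multiplexed isobaric-tag data of the kind cysTMT produces are commonly reduced: reporter-ion intensities from each labeled peptide are converted to ratios against a reference channel and summarized per protein by the median. Channel labels, protein names, and intensities are hypothetical.

import statistics

# Hypothetical reporter-ion intensities per cysteine-containing peptide,
# keyed by protein; each dict maps a reporter channel to its intensity.
peptides = {
    "PROT1": [{"126": 1.0e5, "127": 2.1e5}, {"126": 0.9e5, "127": 1.8e5}],
    "PROT2": [{"126": 5.0e4, "127": 4.8e4}],
}

def protein_ratios(peptide_table, reference="126", treated="127"):
    """Median treated/reference reporter-ion ratio per protein."""
    return {prot: statistics.median(p[treated] / p[reference] for p in peps)
            for prot, peps in peptide_table.items()}

print(protein_ratios(peptides))   # roughly {'PROT1': 2.05, 'PROT2': 0.96}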
