January 1, 2006 (Vol. 26, No. 1)
Working Toward Improving Hazard Identification and Risk Assessment
Adverse health effects resulting from drug toxicity are invariably accompanied by alterations in gene expression. Toxicogenomics aims to interrogate these changes in the hope of improving hazard identification and risk assessment.
Microarray technologies continue to lead the pack in toxicogenomics. Along with other strategies, they are elucidating mechanisms of toxicity and generating toxicity profiles of new compounds with unprecedented precision.
DNA microarrays can measure the effect of drug candidates on the expression of thousands of genes simultaneously. Such profiles provide a means of cataloging a compound’s so-called “signature.”
Databases built from these signatures can be used to assess other candidates’ corresponding biological responses, providing a clearer picture of a compound’s toxicological and pathological endpoints.
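To make the database lookup concrete, a comparison of this kind can be sketched in a few lines of Python; the compound names and expression values below are hypothetical placeholders, not entries from any actual signature database.

```python
import numpy as np

# Hypothetical reference database: compound name -> log2 fold-change
# profile over the same ordered gene set as the query (placeholder values).
reference_db = {
    "compound_A": np.array([1.2, -0.4, 0.8, 2.1, -1.0]),
    "compound_B": np.array([-0.3, 0.9, -1.5, 0.2, 0.7]),
}

def rank_matches(query_profile, db):
    """Rank cataloged signatures by Pearson correlation with the query."""
    scores = {
        name: float(np.corrcoef(query_profile, profile)[0, 1])
        for name, profile in db.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# A new candidate's log2 fold-change profile over the same five genes.
query = np.array([1.0, -0.2, 0.6, 1.8, -0.9])
for name, r in rank_matches(query, reference_db):
    print(f"{name}: r = {r:.2f}")
```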
Among the speakers at Cambridge Healthtech’s “Toxicity Biomarkers” conference held recently in Philadelphia was Jing-Shan Hu, Ph.D., research leader for toxicogenomics at Roche (www.roche.com). Dr. Hu reported that the company’s genetic and genomic group used expression profiling to identify a previously unknown gene expression signature and potential marker for genotoxicity.
“Gene expression profiling of the effect of a number of different genotoxins in an in vitro cell line system identified 58 DNA damage response genes, which provide a composite gene expression signature indicative of genotoxicity,” said Dr. Hu.
“The novel gene showing the largest increase in mRNA expression in response to DNA damage correlates well with genotoxicity for the compounds tested so far. We believe this novel gene is a potential marker that, together with 58 other genes, forms a signature for genotoxicity that can be assessed in vitro.”
Dr. Hu believes this may be an important strategy to assess other new candidates for genotoxicity. “Use of gene expression profiling is a means of assessing toxicity early in the life of a drug candidate and will greatly enhance the toxicity identification process.”
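The article does not describe how Roche scores the signature, but a common generic approach collapses the signature genes into a single summary statistic with a cutoff, as in this hypothetical sketch:

```python
import numpy as np

# Hypothetical z-scored expression changes for the signature's DNA damage
# response genes in one treated sample (placeholder values).
signature_zscores = np.array([2.1, 1.8, 0.4, 3.0, 1.2, 2.6])

def genotoxicity_score(zscores, threshold=1.5):
    """Collapse a signature gene set into one score and call a flag.

    Mean z-score and the 1.5 cutoff are illustrative choices, not the
    scoring method Roche actually uses.
    """
    score = float(zscores.mean())
    return score, score > threshold

score, flagged = genotoxicity_score(signature_zscores)
print(f"signature score = {score:.2f}, genotoxicity flag = {flagged}")
```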
In Vitro Strategies
Developing in vitro systems for toxicology studies can save time and money, according to Eric Blomme, D.V.M., Ph.D., leader of the cellular and molecular toxicology group at Abbott (www.abbott.com).
“Working with in vitro systems early for toxicology profiling is important because it is expensive not only to synthesize compounds but also to perform animal testing. In vitro systems require only milligram, rather than gram, quantities of compound to determine specific toxicity endpoints.
“We feel such strategies provide a valid and useful way of evaluating and selecting compounds within the discovery pipeline. Once that is accomplished, in vivo studies can be conducted with the best candidates with a higher chance of success.”
In collaboration with Iconix Pharmaceuticals (www.iconixpharm.com), Abbott developed gene expression-based screening assays to generate toxicologic profiles of its early-stage compounds. This collaboration employed Iconix’s DrugMatrix system, which catalogs the genomic effects of drug and chemical treatments, as well as the company’s library of Drug Signatures.
According to Dr. Blomme, their approach involves three basic steps. “First, we select the proper dose for evaluation. A basic principle is that everything is toxic if you go high enough. We use isolated rat hepatocytes to determine the dose needed to produce slight toxicity and, therefore, a reliable readout. We then generate a gene expression profile at that concentration using DNA microarrays.
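The article does not spell out how that slightly toxic concentration is chosen; one conventional route, sketched below with made-up viability data, fits a dose-response (Hill) curve and reads off a low-effect dose such as a TC20.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical hepatocyte viability (% of vehicle control) across a
# concentration series in micromolar (placeholder values).
doses = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
viability = np.array([99.0, 97.0, 93.0, 82.0, 55.0, 22.0, 8.0])

def hill(dose, ec50, h):
    """Simple descending Hill curve for viability versus dose."""
    return 100.0 / (1.0 + (dose / ec50) ** h)

(ec50, h), _ = curve_fit(hill, doses, viability, p0=(10.0, 1.0))

# Concentration expected to cause ~20% loss of viability (a "TC20"):
# solve 100 / (1 + (d / ec50)^h) = 80 for d.
tc20 = ec50 * (100.0 / 80.0 - 1.0) ** (1.0 / h)
print(f"EC50 = {ec50:.1f} uM, slope = {h:.2f}, TC20 = {tc20:.1f} uM")
```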
“Next, we create a large reference database of gene expression profiles induced by positive and negative control compounds for selected toxicologic endpoints, such as DNA damage, mitochondrial damage, phospholipidosis, microvesicular steatosis, and peroxisome proliferation.
“This reference database has allowed us, using various general and proprietary methods, to generate signatures composed of 20–80 genes that correlate with these selected toxicologic endpoints. Understanding the performance of these assays is critical, as one wants to avoid false positive results at this stage of the discovery process.
“Finally, we transfer these signatures to more cost-effective platforms amenable to a higher throughput. Our chemists now utilize this in vitro toxicology characterization in SAR studies at the lead optimization stage to improve the toxicologic profile of their compounds.”
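As a rough illustration of the signature-derivation step, the following sketch ranks genes by how well they separate simulated positive and negative control profiles; the t-statistic ranking stands in for Abbott’s general and proprietary methods, which the article does not disclose.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

# Hypothetical reference profiles: rows = treatments, columns = genes.
# 'pos' are positive controls for one endpoint; 'neg' are negative controls.
pos = rng.normal(0.0, 1.0, size=(20, 500))
pos[:, :40] += 2.0                      # the first 40 genes truly respond
neg = rng.normal(0.0, 1.0, size=(20, 500))

def derive_signature(pos, neg, n_genes=30):
    """Select the genes that best separate positive from negative controls."""
    t, _ = ttest_ind(pos, neg, axis=0)  # per-gene t-statistics
    return np.argsort(-np.abs(t))[:n_genes]

signature = derive_signature(pos, neg)
print(f"{signature.size}-gene signature, e.g. gene indices {signature[:5]}")
```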
Deriving accurate and consistent information from volumes of microarray toxicity data is a daunting task, according to Susan Flood, global pharma strategist at SAS (www.sas.com). “One of the most common challenges scientists face is correlating their microarray-based toxicological response data with specific biomarkers.”
Flood suggested that microarray/genomics technologies need some key refinements to transition arrays from the lab to the clinic. “Part of the problem is a lack of appropriate standardization, such as with experimental standards, data standards, and QA/QC standards.
“Other problems include platform variability and the ability to distinguish adaptive from toxic reactions. To establish quality analysis procedures, it is critical to first characterize variability.
“For example, scientists must evaluate signal-to-noise considerations. In this case, important aspects are appropriate experimental design, careful evaluation of data precision and accuracy, and appropriate statistical analysis of the signal.
“It’s also critical to standardize methods from lab to lab and platform to platform. GLP documentation must be maintained at each of these steps. An emerging theme in cross-platform studies is to expect and deal with considerable biological variation.”
According to Flood, to maximize the quality of toxicogenomic data, scientists must develop common standard operating procedures across all collaborations. “It’s important to minimize variables in different platforms, sample handling, hybridization, and ontologies, and to maximize throughput controls under GLP.
“Accurate analyses require both a statistically driven approach as well as interpretation and visualization skills. Only then will biological themes appear and provide the ability to derive predictive models.”
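A generic example of the statistically driven analysis Flood describes is a per-gene differential-expression test followed by multiple-testing correction; the simulated data and Benjamini-Hochberg procedure below are illustrative and not specific to SAS’s tools.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

# Hypothetical log2 expression for 1,000 genes: five treated vs. five
# control replicates; only the first 50 genes truly change.
control = rng.normal(0.0, 0.5, size=(5, 1000))
treated = rng.normal(0.0, 0.5, size=(5, 1000))
treated[:, :50] += 1.5

_, pvals = ttest_ind(treated, control, axis=0)

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of genes significant at the given false discovery rate."""
    m = pvals.size
    order = np.argsort(pvals)
    adjusted = pvals[order] * m / np.arange(1, m + 1)
    passed = adjusted <= alpha
    mask = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.nonzero(passed)[0].max()  # step-up rule: reject the first k+1
        mask[order[: k + 1]] = True
    return mask

significant = benjamini_hochberg(pvals)
print(f"{significant.sum()} genes significant at 5% FDR")
```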
Transcriptional Profiling
Transcript profiling uses microarray technologies to measure changes in gene expression induced by a drug candidate and, potentially, to gauge its toxicological effect. According to Simon Plummer, Ph.D., senior scientist, CXR Biosciences (www.cxrbiosciences.com), the perception and uses of transcript profiling are changing.
“Several years ago this technology was perceived as a way of screening compounds for toxicity,” Dr. Plummer stated. “However, the enormous complexity of the generated data has limited its use. Our company has a newer, more focused approach that uses the power of transcript profiling to understand the mechanism of action of drugs.”
Tom Shepherd, Ph.D., CEO, noted, “We use an Agilent (www.agilent.com) platform with 60-mer oligonucleotide arrays. cDNAs are generated from test and control mRNA samples via reverse transcription, differentially labeled, and hybridized to the microarray. This creates a ‘signature list’ of significantly regulated genes.”
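For a two-color design of the kind Dr. Shepherd describes, the signature list falls out of per-gene log ratios between the labeled channels, as in this sketch with hypothetical intensities and an arbitrary two-fold cutoff.

```python
import numpy as np

# Hypothetical background-corrected channel intensities for one array:
# Cy5 = test sample, Cy3 = control sample (placeholder values per gene).
genes = ["geneA", "geneB", "geneC", "geneD", "geneE"]
cy5 = np.array([5200.0, 150.0, 980.0, 4100.0, 260.0])
cy3 = np.array([1300.0, 140.0, 1050.0, 500.0, 2100.0])

# log2 ratio of test over control; a two-fold change (|log2| >= 1) is an
# illustrative cutoff for calling a gene significantly regulated.
log_ratios = np.log2(cy5 / cy3)
signature_list = [g for g, r in zip(genes, log_ratios) if abs(r) >= 1.0]
print(signature_list)  # ['geneA', 'geneD', 'geneE']
```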
Making sense of that data comes next. “We perform clustering analyses designed to find trends in data sets by grouping together similar objects (i.e., genes, treatments),” Dr. Plummer continued. “This reveals functional relationships. We next utilize a bioinformatics approach in which we perform functional annotation and classification. Utilizing the Ingenuity (www.ingenuity.com) Pathways Analysis software, we generate an extensive pathway profile that we relate to our existing database of toxicity pathways.”
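A minimal version of such a clustering analysis, assuming SciPy’s hierarchical clustering and simulated co-regulated gene groups rather than CXR’s actual pipeline, might look like this:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)

# Simulate two co-regulated gene groups sharing opposite expression
# patterns across six treatments (rows = genes, columns = treatments).
pattern = np.array([1.0, 2.0, 0.5, 1.5, 2.5, 1.0])
up = pattern + rng.normal(0.0, 0.2, size=(10, 6))     # induced together
down = -pattern + rng.normal(0.0, 0.2, size=(10, 6))  # repressed together
profiles = np.vstack([up, down])

# Average-linkage clustering on correlation distance groups genes that
# move together across treatments, hinting at functional relationships.
tree = linkage(profiles, method="average", metric="correlation")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)  # two clusters matching the two simulated groups
```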
One of the important applications of CXR’s approach is identifying cell-type specific effects on genes. “A lot of mechanistic information is generated in vivo on whole organs,” said Dr. Plummer.
“While this provides a great deal of heterogeneous information, it doesn’t profile specific areas of tissue. We analyze gene expression using laser capture microdissection. This technology permits isolation of small amounts of specific tissue. We analyze by microarrays, compare gene signatures, and identify cell-type-specific effects on target genes.”
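Conceptually, the comparison of cell-type signatures reduces to set operations over the regulated gene lists from each microdissected compartment; the gene symbols below are hypothetical examples, not CXR data.

```python
# Hypothetical signature gene lists from two laser-captured cell types
# treated with the same compound (placeholder gene symbols).
hepatocyte_signature = {"Cyp1a1", "Gadd45a", "Mdm2", "Hmox1", "Fmo3"}
bile_duct_signature = {"Gadd45a", "Mdm2", "Krt19", "Spp1"}

shared = hepatocyte_signature & bile_duct_signature
hepatocyte_only = hepatocyte_signature - bile_duct_signature

print(f"shared response: {sorted(shared)}")
print(f"hepatocyte-specific: {sorted(hepatocyte_only)}")
```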
Other applications cited by Dr. Plummer include “re-assessing or rescuing a compound that has run into difficulty in preclinical work by assessing the relevance of toxicological mechanisms in animal models to man. Another use is separating on- and off-target drug effects. Overall, we’re finding that transcript profiling is a powerful but evolving technology whose potential has not yet been fully realized.”
Unmasking Biomarkers
The ability to measure biological signals in patient samples may accurately assess or, even better, predict drug-induced toxicity. Thus, the identification and validation of sensitive and specific toxicity biomarkers not only may define mechanisms involved in toxicity, but also improve risk assessment, a fundamental process in drug development.
Unfortunately, many biomarkers used in preclinical and clinical assessments of drugs are either buried within the protein repertoire of the sample or are non-specific.
Nerviano Medical Sciences (www.nervianoms.com) developed a novel plasma proteomic approach to identify specific biomarkers of toxicity. Miro Venturi, Ph.D., head of the investigative sciences unit within preclinical development, reported, “Our organization’s main objective is to identify biomarkers, validate them with respect to human relevance, and correlate this with mouse models.
“These same biomarkers are used for in vitro screening. To do this, biomarkers must be detectable in biological fluids, amenable to translation from mouse to human, and quantitatively measurable. Plasma also contains many abundant proteins that are not critical biomarkers, including albumin, immunoglobulins, and transferrin.”
“Because of this, proteomic analyses can’t easily identify plasma markers by the usual 2-D polyacrylamide gel electrophoresis coupled to mass spectrometry (2D-PAGE/MS). We developed a method in which we use an immunoaffinity depletion step performed with antibodies raised in chicken (IgY) that are directed against such irrelevant proteins. We immunodeplete these from plasma or serum and then assess the residual proteins in the sample.
“We can gain a clearer picture of the less abundant marker proteins in this way. For example, in a mouse model in which we induced hepatotoxicity with lipopolysaccharide, we were able to detect tumor necrosis factor-alpha by this approach (coupled to 2D-PAGE with MALDI-TOF mass spectrometry).
“This cytokine is usually only detectable via sensitive immunoassays. Thus, our single-step approach allows easier and more accurate monitoring and unmasking of biomarkers previously hidden in a biological sample.”
Dr. Venturi noted that the company currently is monitoring approximately 1,000–3,000 biomarkers in fluids, focusing their efforts on quality and validation rather than quantity. These also include proteins bearing post-translational modifications such as phosphorylation and glycosylation.
Computational Modeling
Harnessing the power of computational modeling of biological pathways can help decipher molecular mechanisms of toxicity, according to Gordana Apic, Ph.D., CEO, Cambridge Cell Networks (www.cambridgecellnetworks.com).
“One of the biggest challenges in drug discovery today is understanding the molecular mechanisms of drug actions and mechanistic toxicity. Our company provides a series of products for elucidating mechanisms of toxicity. We use a multidisciplinary approach integrating bioinformatics, genomics and proteomics, chemistry, text mining, and mathematics.”
The company provides databases and core software products. For example, it offers a manually curated database of more than 25,000 chemical structures (World-of-Chemistry, or WOCBase).
“This database also contains chemical-to-protein relationships enabling easy identification of genes and proteins responsible for the toxic effects,” Dr. Apic noted. “We also have manually curated data on biological interactions for more than 500 human pathways including toxicity-specific ones.
“For investigators using other model organisms, we have equivalents to human biological pathways for more than 10 model organisms (e.g., mouse, rat, dog, zebrafish, fly, chicken) that are relevant to toxicity and safety studies. This enables fast cross-species translation of toxic effects. Another database provides toxicological endpoints and profiles for chemicals.”
A series of core software products offered by Cambridge Cell Networks helps investigators probe gene and protein interactions. “Our text-mining software (Smartex) gathers textual information for searching our curated databases. For example, one can type in ‘liver hyperplasia’ and go straight to the genes/proteins, pathways, and chemicals that have been observed to play a role in causing liver hyperplasia,” said Dr. Apic.
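The underlying lookup can be pictured as an index from phenotype terms to curated genes, pathways, and chemicals; the toy structure below is only a stand-in for interfaces the article does not detail.

```python
# Toy stand-in for a curated phenotype-to-biology index; the real
# WOCBase/Smartex interfaces are not described in the article, so this
# only illustrates the kind of lookup Dr. Apic outlines.
index = {
    "liver hyperplasia": {
        "genes": ["Myc", "Ccnd1", "Egfr"],
        "pathways": ["cell cycle", "EGFR signaling"],
        "chemicals": ["phenobarbital"],
    },
}

def lookup(term):
    """Return genes, pathways, and chemicals linked to a phenotype term."""
    return index.get(term.lower().strip(), {})

print(lookup("liver hyperplasia"))
```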
“Another advantage is in so-called drug rescue: if efficacy fails for the first target, a company can explore other pathways that the therapeutic may impact in order to find another application for it.”
According to Dr. Apic, Cambridge Cell Networks’ products may assist in a number of areas such as “safety assessment of chemicals in industries such as pharmaceuticals, chemicals, pesticides/herbicides, and for personal-care products. They are also useful for finding toxicity or other general biomarkers, for DNA microarray and proteomics data analysis, and for mining a client’s own text data.”