Feature Articles : Jan 1, 2009
High-Content Screening Rapidly Evolving
Researchers Are Currently Focused on Assay Development and Data Analysis
Advances in high-content screening (HCS) instrumentation have significantly increased throughput and improved ease of use, such that the technology is moving from its traditional role in secondary screening to the earlier stages of the drug-discovery process. At the same time, the process bottleneck has shifted from screening to analytics. The challenges inherent in the technology, including data capture, storage, and multivariate data analysis, will be discussed at Cambridge HealthTech’s “High-Content Screening” meeting to be held later this month in San Francisco.
A handful of presenters spoke to GEN prior to the meeting. Among them was Ann Hoffman, senior principal scientist at Roche Discovery Technologies, who will discuss assay development in the “Setting up an HCA Lab” workshop at the conference. Specifically, she will talk about her experiences with Columbus™, the new enterprise system software from PerkinElmer, currently being evaluated by several of Roche’s core screening labs using images and data captured on the Opera system.
“High-content screening instrumentation and software have evolved to the point that the current focus for researchers is on assay development and data analysis,” Hoffman explains. “The current challenges in the lab are how to develop robust assays to measure cellular processes in new cell types, including cardiomyocytes, primary cells, T cells, and B cells. In addition, developing new probes and sensors to follow complex cellular processes and to identify key proteins in the signal transduction pathways of interest is a challenge. The constant push is toward more quantitative measurements that enable a better understanding of the disease process.”
Most HCS labs today have multiple instruments, some of which are dedicated to particular assay protocols. HCS has now been pushed upstream in the drug discovery and development process to be fully integrated into most primary screens, particularly those where imaging is essential to query the antigens of interest. It has also been pushed downstream for applications such as predictive tox profiling and other forms of compound profiling.
While the progress has been measurable, it has not yet yielded new marketed drugs. Scientists are, however, contributing to the cellular knowledge base that is improving drug-candidate prioritization earlier in the drug-development process. Further, there is a growing trend toward combinatorial analysis, for example, combining HCS with genomic protocols and tools (e.g., siRNA). With such combined approaches, scientists have gathered breakthrough results that have increased the predictive value of drug discovery.
The development of predictive in vitro toxicology tools for use in high-content screening, a focus at Millipore, has become a high-growth area for this reagent provider. Following up on its kits for hepatotoxicity, cell cycle analysis, and DNA damage, Millipore launched its next set of kits, focusing on neurotoxicity, in December. At the HCS meeting, Stella Redpath, Ph.D., group product manager, will present case-study data based on the use of these recently launched kits for monitoring neurotoxicity using HCS.
The kits provide a complete solution with reagents, antibodies, and a control set of neurotoxins, and they are suitable for all high-content instrumentation on the market, according to Dr. Redpath. The kit reagents can also be used for sample detection by confocal or conventional fluorescence microscopy.
“The assay has been successfully applied to neural cell growth using human, rat, and mouse cell types with primary cells and cell lines. We also have developed co-culture conditions for the use of astrocytes in culture with neurons.”
“In addition, we’ve developed a differentiated cellular model based on PC12 rat cells that have been treated with nerve growth factor for six days, as well as the co-culture assays for neurons and astrocytes. This cellular model is sensitive to neurotoxins as measured by an altered profile of neural-specific biomarkers: βIII-tubulin, synaptophysin, and glial fibrillary acidic protein.”
The three biomarkers are the best predictors of neurotoxicity available today, Dr. Redpath insists. Her group has found that the combined profile of these markers is informative, providing an earlier indication of neurotoxicity than can be seen by just looking for morphological changes in the cells, including neuronal damage, astrocyte hypertrophy, gliosis, and synaptotoxicity.
“The kits may be more sensitive than traditional assays such as MTT or LDH assays, which rely on late-stage cytotoxic endpoints. Our new kits can detect neurotoxic effects at lower concentrations than these biochemical assays. Further, the profiles correlate highly with the results of morphological studies including neurite outgrowth as reported in the literature.”
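The idea of a combined marker profile being more informative than any single readout can be sketched in code. The following is an illustrative assumption, not Millipore's actual algorithm: each marker (names follow the article) is converted to a z-score against vehicle-control wells, and the absolute z-scores are summed into one profile score.

```python
# Hypothetical sketch: combining per-marker HCS readouts into a single
# neurotoxicity profile score. The z-score-vs-vehicle scheme is an
# illustrative assumption, not the vendor's published method.
from statistics import mean, stdev

MARKERS = ["bIII_tubulin", "synaptophysin", "GFAP"]

def profile_score(treated, vehicle):
    """Sum of absolute per-marker z-scores relative to vehicle controls.

    treated: dict marker -> measured intensity for one compound well
    vehicle: dict marker -> list of intensities from control wells
    """
    score = 0.0
    for m in MARKERS:
        mu, sigma = mean(vehicle[m]), stdev(vehicle[m])
        score += abs((treated[m] - mu) / sigma)
    return score

# Toy values: the treated well shows loss of neuronal markers and a
# rise in GFAP (astrocyte hypertrophy), so its score is large.
vehicle = {
    "bIII_tubulin": [100, 98, 102, 101],
    "synaptophysin": [50, 52, 49, 50],
    "GFAP": [30, 31, 29, 30],
}
well = {"bIII_tubulin": 60, "synaptophysin": 30, "GFAP": 55}
print(round(profile_score(well, vehicle), 1))
```

A direction-aware score (marker down vs. up) would be a natural refinement, since the article notes that different markers move in opposite directions under different toxic mechanisms.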
Stephan Heyse, GM of screening informatics at Genedata, will focus on the current pain point for screeners: the systematic processing, analysis, and management of HCS data. According to Heyse, the focus of data analysis has broadened beyond image analysis to the extraction of full sets of numeric data from multiple parameters. Genedata has made strides to enable this kind of analysis by helping researchers set the framework by which data is captured and processed via standard workflows, Heyse explains.
“This is an iterative process where outcomes often can’t be predetermined, nor should they be,” Heyse notes. “We see unexpected compound effects all the time, which provide us with new inroads for how to ask the right questions in the next set of experiments or in a re-analysis of this data.
“For example, when neurons are treated with a set of compounds, neurite outgrowth may occur. What’s the key outcome from this stimulus? Is this stimulus better quantified by determining neurite length or branching? When in the process does the branching occur? Do the neurons branch and then die, pointing to a side effect of treatment? Such new insights come from pursuing the iterative experiment-analysis process to completion.”
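The length-versus-branching question Heyse raises can be made concrete with a toy example. Real pipelines derive such metrics from skeletonized images; here, purely for illustration (this is not Genedata's implementation), a traced neurite tree is represented as a list of segments, and the two candidate readouts are computed from it.

```python
# Illustrative sketch of two competing neurite readouts: total outgrowth
# length vs. branch-point count. The segment-list representation is a
# simplifying assumption standing in for a skeletonized image.
from collections import defaultdict

def neurite_metrics(segments):
    """segments: list of (node_a, node_b, length_um) tuples."""
    degree = defaultdict(int)
    total_length = 0.0
    for a, b, length in segments:
        degree[a] += 1
        degree[b] += 1
        total_length += length
    # Branch points are nodes where three or more segments meet.
    branch_points = sum(1 for d in degree.values() if d >= 3)
    return total_length, branch_points

# A soma (node 0) with one neurite that bifurcates at node 2.
tree = [(0, 1, 12.0), (1, 2, 8.0), (2, 3, 5.0), (2, 4, 6.5)]
length, branches = neurite_metrics(tree)
print(length, branches)
```

Two compounds could score identically on total length yet very differently on branching, which is exactly why, as Heyse argues, the choice of readout often only becomes clear after a first round of analysis.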
The Screener platform from Genedata integrates with the software packages of the HCS instruments on the market, providing an information-rich, bidirectional bridge between imaging platforms that enables data integration and a backward review of the original data and images associated with a well or compound, he adds. The Genedata framework for analysis is applicable to both HCS and HTS, ultimately bringing together high-quality results obtained from multiple screening technologies.
At the conference, Dr. Heyse will be providing case studies that highlight best-practice workflows embodied in the Genedata Screener platform to capture data from instruments, process it in efficient ways, and interactively analyze the wealth of information from high-content screens.
Marjo Simonen, Ph.D., lab head of the Novartis HCS facility in Basel, Switzerland, will provide insight on the end-user’s perspective. Her lab provides HCS screens for all the therapeutic research groups in the Basel site, including the oncology, neurobiology, and respiratory groups, wherever imaging data is an essential output for the screen.
“We have been using HCS for compound screening since 2005 and are currently processing up to 250,000 compounds per screen,” says Dr. Simonen. “We use the IN Cell Analyzer 3000 for all our screens, and we’ll soon be adding a robotics system to allow for unattended screening work overnight and over the weekend to increase our throughput. We are hopeful that we’ll be able to do full library screens (>1 M compounds) in 2009.”
This is not to say that Novartis has moved away from HTS—the company still performs biochemical and cell-based assay screens for many primary screens, but when imaging is the essential output needed by the therapeutic group, HCS is applied.
“We perform the primary screen usually with singletons, then we validate the hits with a concentration response using quadruplicate samples. Usually we also perform counterscreens and secondary screens; sometimes only secondary screens for biochemical or other cellular screens,” Dr. Simonen adds. “We provide our customers with analyzed data (IC50, maximal activity), not raw data. The analysis of HCS data is more laborious than that of other screening data due to the multiple readouts. We have in-house software to analyze and visualize the numerical data. In addition, we just bought an Opera reader, which accepts 1,536-well plates. We hope to be able to run full screens with this.”
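The hit-validation step Dr. Simonen describes, deriving an IC50 from quadruplicate concentration-response data, can be sketched as follows. This is a hedged, pure-Python illustration using a coarse grid search over a four-parameter logistic (Hill) curve; screening labs use dedicated curve-fitting software, and the concentrations and responses below are invented.

```python
# Toy sketch: estimate IC50 from quadruplicate concentration-response
# data by grid-searching a four-parameter logistic (Hill) model.
# Illustrative only; not the Novartis in-house analysis software.
from statistics import mean

def hill(conc, top, bottom, ic50, slope):
    """Four-parameter logistic response at a given concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

def fit_ic50(concs, replicate_responses, top=100.0, bottom=0.0):
    """Grid-search IC50 and Hill slope, minimizing squared error
    against the mean of the quadruplicates at each concentration."""
    means = [mean(r) for r in replicate_responses]
    best = (None, None, float("inf"))
    for ic50 in [c * f for c in concs for f in (0.5, 1.0, 2.0)]:
        for slope in (0.5, 1.0, 1.5, 2.0):
            err = sum((hill(c, top, bottom, ic50, slope) - m) ** 2
                      for c, m in zip(concs, means))
            if err < best[2]:
                best = (ic50, slope, err)
    return best[0], best[1]

concs = [0.01, 0.1, 1.0, 10.0, 100.0]  # invented concentrations, µM
responses = [[99, 101, 98, 100], [92, 90, 91, 93],
             [52, 50, 49, 51], [11, 9, 10, 12], [1, 2, 0, 1]]
ic50, slope = fit_ic50(concs, responses)
print(ic50, slope)
```

In a real HCS validation each of the multiple readouts (e.g., per-cell intensity, translocation, morphology) would get its own fit, which is one reason the analysis is, as Dr. Simonen notes, more laborious than for single-readout screens.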
At Novartis, HCS is also used both in the upstream and downstream steps of the drug discovery process for target finding, compound profiling, SAR, and further understanding of the mode of action of a drug candidate.
At the meeting, Dr. Simonen will present a case study on the development of an imaging assay to screen for inhibitors of cytomegalovirus (CMV) entry into normal human fibroblasts and human retinal pigmented epithelial cells. The assay exploits a genetically modified virus expressing a green fluorescent protein (GFP) fusion as part of its virion tegument. Upon successful entry, the GFP-tagged tegument protein translocates to the nucleus, permitting monitoring of virus infection. The assay monitors localization of the GFP tag to the nucleus and granularity of the cells as a measure of compound toxicity. The goal is to identify compounds that block translocation of the tag without inducing toxicity in the cells.
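A common way to turn such a translocation assay into a number is to score each cell by the ratio of GFP intensity inside the nuclear mask to that in a cytoplasmic ring around it. The sketch below illustrates that ratio on toy pixel values; the masks, numbers, and cutoffs are assumptions for illustration, not the actual Novartis analysis.

```python
# Illustrative nuclear-translocation readout: mean GFP intensity in the
# nuclear mask divided by mean intensity in the cytoplasmic ring.
# Pixel values and thresholds are invented for this sketch.
from statistics import mean

def translocation_ratio(nuclear_pixels, cytoplasm_pixels):
    """Higher ratio -> tegument-GFP has reached the nucleus (infection);
    an entry-blocking compound keeps the ratio near 1 or below."""
    return mean(nuclear_pixels) / mean(cytoplasm_pixels)

infected = translocation_ratio([220, 235, 228], [40, 38, 42])
blocked = translocation_ratio([45, 50, 48], [44, 46, 47])
print(infected > 2.0 and blocked < 1.5)  # True under these toy values
```

Pairing this ratio with a per-cell granularity measure, as the assay described above does, lets hits (low ratio) be separated from cytotoxic false positives in the same pass.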