October 15, 2009 (Vol. 29, No. 18)

Vicki Glaser, Writer, GEN

Advances in Methodology Yield Functional and Mechanistic Data Earlier in Discovery Process

High-content screening (HCS), and the technology to run it faster, on more compounds, and with quantitative, multiparametric output, took center stage at CHI’s “High Content East” meeting held in Boston last month. Presenters described how they are implementing enhanced screening systems, image-analysis methods, and data-management strategies to achieve daily HCS runs on tens of thousands of wells and screening campaigns totaling 200,000 to 3 million wells.

High-throughput HCS—albeit not yet reaching the numbers common for conventional high-throughput screening (HTS) and with lingering limitations and challenges related to live-cell imaging over time—is making its mark and being used to probe the biological basis of disease and to detect even subtle phenotypic changes in response to experimental compounds.

Determining whether a cell looks like a cancer cell, for example, typically requires being able to detect subtle morphological changes, such as small alterations in size or structure, changes in the connections a cell makes with neighboring cells, or variations in the texture of staining. These have, historically, been mainly qualitative parameters detected by studying and comparing images of cells.

In her talk at the conference, Anne Carpenter, Ph.D., director of the imaging platform at the Broad Institute of MIT and Harvard, presented her group’s work using HCS and image analysis to quantify difficult phenotypes and differentiate disease states such as leukemia.

Not only do HCS systems and image-analysis software automate the screening process, enabling the analysis of many more cells in less time and increasing the chances of detecting even small numbers of altered cells, they can also utilize algorithms that evaluate defined combinations of parameters in a quantifiable manner and apply techniques to distinguish between clumped or closely juxtaposed cells. Relying on computer-based image analysis also standardizes the process, eliminating variability due to differences in human expertise, consistency, and fatigue.
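
How software splits clumped cells varies by vendor, but one common approach is a seeded watershed on the distance transform of a thresholded image. The sketch below is an illustrative Python example using scikit-image on a synthetic mask, not any particular vendor’s algorithm:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

# Synthetic binary mask of two overlapping, roughly circular "cells"
yy, xx = np.mgrid[0:100, 0:100]
mask = (((yy - 50) ** 2 + (xx - 38) ** 2) < 400) | \
       (((yy - 50) ** 2 + (xx - 62) ** 2) < 400)

# Peaks of the distance transform provide one seed per presumed cell
distance = ndi.distance_transform_edt(mask)
coords = peak_local_max(distance, min_distance=10, num_peaks=2)
markers = np.zeros(mask.shape, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)

# The seeded watershed then splits the clump into separately labeled objects
labels = watershed(-distance, markers, mask=mask)
print("objects found:", labels.max())  # 2, despite the touching cells
```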

Dr. Carpenter’s group uses machine-learning methods to train image-analysis software to identify subtle phenotypic changes. Biologists work with the software in an iterative fashion in a process called supervised machine learning. They teach and correct the computers on a series of test images, refining the system’s knowledge base in a process that typically takes less than a day. The group developed the algorithms used by the biologists and has made them available as open-source software.
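
As a rough illustration of this supervised workflow, the Python sketch below trains a classifier on hand-labeled per-cell feature vectors and then scores an unlabeled screen. It uses scikit-learn with synthetic stand-in data; the Broad’s actual open-source tools (CellProfiler and its companions) differ in detail:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-ins for measured per-cell features (area, intensity, texture, ...)
X_labeled = rng.normal(size=(500, 40))    # 500 biologist-labeled cells
y_labeled = rng.integers(0, 2, size=500)  # 0 = normal, 1 = altered phenotype

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_labeled, y_labeled)

# Score a full screen and report the per-well fraction of "altered" calls
X_screen = rng.normal(size=(10_000, 40))
well_ids = rng.integers(0, 384, size=10_000)
calls = clf.predict(X_screen)
for well in range(3):
    in_well = well_ids == well
    print(f"well {well}: {calls[in_well].mean():.2%} altered cells")
```

In practice the biologist’s corrections on test images become new labeled examples, and the classifier is retrained until its calls agree with expert judgment.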

A recent paper in PNAS by T. R. Jones et al. documents the use of a trained image-analysis system to discriminate 15 different cellular phenotypes. Other projects involve teaching the software to discriminate leukemic from normal cells, to identify liver cells that are growing normally in culture—to aid in the development of physiologic models of liver function for use in drug testing—and to detect subtle changes that signal the initiation of cell division for studying cell-cycle regulation in cancer.

Neil Carragher, Ph.D., senior scientist in the advanced science and technology laboratory at AstraZeneca, described how the company is applying high-content and live-cell imaging techniques and integrating the results with data derived from in vivo imaging and proteomic studies to improve clinical predictability.

Dr. Carragher’s group combines the results of high-content in vitro and in vivo assays to generate mechanistic information about phenotypic responses to candidate therapeutic compounds. The goal is to create a multiparametric fingerprint of a phenotype from images generated by HCS and to use this knowledge to enhance predictions of efficacy and toxicity early in drug discovery and reduce attrition later in development.

The phenotypic signatures are based on measurements of approximately 150 different parameters per cell for each assay. Data from multiple assays is collated for every test compound and compared with data obtained using well-characterized reference compounds to generate mechanistic hypotheses.
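
Conceptually, generating such a hypothesis amounts to ranking reference fingerprints by their similarity to the test compound’s fingerprint. The sketch below is a hypothetical Python illustration using Pearson correlation on synthetic 150-feature vectors; the metric and reference set are assumptions, not AstraZeneca’s published pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
n_features = 150  # per-cell parameters averaged per compound

references = {  # fingerprints of well-characterized reference compounds
    "tubulin inhibitor": rng.normal(size=n_features),
    "kinase inhibitor": rng.normal(size=n_features),
    "DNA-damaging agent": rng.normal(size=n_features),
}
test_fingerprint = rng.normal(size=n_features)

def pearson(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Rank references by similarity to suggest a mechanistic hypothesis
for mechanism, fp in sorted(references.items(),
                            key=lambda kv: pearson(test_fingerprint, kv[1]),
                            reverse=True):
    print(f"{mechanism}: r = {pearson(test_fingerprint, fp):+.3f}")
```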

Only recently has open-source and commercial software become available “that allows you to quantitate more complex phenotypes, subtle changes, and heterogeneous responses from images,” Dr. Carragher said.

His group is employing two main approaches—each with different advantages and limitations. The first strategy relies on Definiens’ Cognition Network Technology™ software, which allows users to develop algorithms that capture, computationally, what researchers can see visually. “It is very much context-based” and identifies objects based on how they are related to others in the image, rather than as individual pixels, explained Dr. Carragher. The in-house algorithm-development process depends on iterative programming steps. The other approach involves machine-learning tools such as CellProfiler, developed at the Broad Institute.


Zebrafish expressing nitroreductase in pancreatic islets: stained for islet cells (yellow) and exocrine pancreas (red) (GE Healthcare/Steven Chen, UCSF)

Redirecting Approved Drugs

Identifying new applications for FDA-approved drugs using HCS and image-based systems biology is the focus of work being done by Stephen Wong, Ph.D., founding director of the bioinformatics and biomedical engineering program and the cellular and tissue microscopy core at the Methodist Hospital Research Institute and professor of radiology and neurosciences at Weill Cornell Medical College.

In his talk, Dr. Wong gave examples of screening campaigns to decipher targets in the pathways responsible for the metastasis of breast cancer to the brain. He specifically described the computational tools his group is developing for high-content and network analysis, and the animal-imaging techniques being used to evaluate combinations of small molecule chemotherapeutic agents for their ability to cross the blood-brain barrier and to have an effect against central nervous system metastases in breast cancer.

Dr. Wong’s group has also developed a series of quantitative image-analysis tools, including zebrafish image quantifier (ZFIQ), as well as software for studying neuronal spines (NeuronIQ), neurites (NeuriteIQ), and time-lapse mitotic events in cells (DCellIQ). Dr. Wong’s HCS/systems biology research is funded by the NCI, NIA, and NLM.

Because the compounds being studied are already approved drugs, Phase I trials are not needed. The quantitative data generated from HCS provides the evidence necessary for moving into Phase II studies, shortening the drug-development cycle to a year or less.

The types of studies essential to Dr. Wong’s efforts, such as assays to monitor cell-cycle regulation or dendritic spine dynamics, require time-lapse, live-cell imaging. Looking at fixed cells provides only an artificial snapshot of where cells are at a particular point in time, explained Dr. Wong. “We want to look at a 384-well plate of continuously growing cells over five to six days,” he said, and in his view none of the instrument manufacturers competing in the HCS market has yet provided a robust, incubator-based, environmentally controlled system that can achieve this.

Vendors have tended to view HCS as just another type of high-throughput screening, but live-cell imaging done in as natural an environment as possible has quite different requirements, contended Dr. Wong.

“Vendors are going in the wrong direction. The power of HCS is in the ability to visualize things in action and to extract lots more quantitative information from the images. If you, instead, retrofit HCS to HTS, you are losing its advantages,” such as the ability to see cells or spines change over time, to visualize cell-cell interactions, and to synchronize cell populations and study cell-cycle events in time-lapse, said Dr. Wong.

In any experiment, “if you generate enough data you will get hits, but how many will be real hits versus false positives?” asked Dr. Wong. “We need to push the quality upfront on the biology side” and screen out, earlier in the discovery process, compounds that are destined to fail.

Researchers at Pfizer are using HCS to study the genetic variation and physiologic interactions that underlie hepatic insulin resistance in type 2 diabetes and the prediabetic state. Diabetes is a complex, multigenic disease, and while advances in genomic and SNP-based technologies have led to the identification of at least 30 genes that contribute to the diabetic phenotype, much work remains to understand their role in cell biology and disease and how they interact.

“If you are careful about the cell models you choose, you can use HCS to characterize these genes and monitor their effects on biochemical pathways,” said Steven Haney, Ph.D., associate fellow in biological profiling at Pfizer’s biotherapeutics and bioinnovation center. The company has invested heavily in developing cell models that are representative of human physiology, including hepatocytes that faithfully mimic liver function when grown in culture.

The other main aspect of this research effort involves identifying changes that affect the diabetic phenotype, specifically glucose storage and utilization pathways, and distinguishing effects that involve the insulin-signaling pathway from more general phenomena related to activation of toxicologic or stress pathways.

“HCS can alert us to things we don’t necessarily know to look for, in a mechanism-independent way,” said Dr. Haney. “The increasing throughput of HCS allows us to look at a lot of cells and determine whether subtle phenotypic changes are significant or spurious.” 


A cytotoxicity assay generated using HepG2 human hepatocytes.

Vendors Roll Out Image-Analysis Solutions

Versatility across application areas, from microscope-based imaging for detecting intracellular phenomena to high-speed scans at the cellular level to whole organism screening, is the focal point of instrument development at MDS Analytical Technologies. “With the options in our Complete Solution and the right infrastructure, you can use image-based assays for primary screening. We have tackled all the common bottlenecks,” said Michael Sjaastad, Ph.D., director of marketing for cellular imaging at MDS.

The IsoCyte® DL laser-scanning cytometer complements the company’s ImageXpress® instrument platform as part of its overall HCS solution. MDS offers a high-throughput option that can screen and do image analysis on a 1,536-well plate in two to five minutes, according to Dr. Sjaastad. The instrument can image whole wells for accurate cell counting in cell-viability measurements, scan a microscope slide, or produce and analyze images of organisms such as zebrafish or worms when used in conjunction with the MetaXpress image-analysis software.

For now, current systems “have the image resolution and acquisition speed researchers need,” and in Dr. Sjaastad’s view, future improvements will focus on “streamlining the data-analysis workflow and bringing the costs down per data point.”

In a workshop at the meeting, Oliver Leven, Ph.D., head of screener professional services at Genedata, identified several ongoing challenges in HCS, including managing the volume and complexity of the data, improving the efficiency of data analysis, and creating an audit trail of results interpretation. As the throughput and scale of HCS increase, so too do the difficulty and scope of these challenges.

As researchers scale up an assay for high-throughput HCS, they need to select a defined set of parameters that represent the phenotype of interest and that allow them to assess the quality of both the assay and the data output. They also need to identify threshold values above or below which a result signifies a change in phenotype.
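
Both decisions are typically grounded in control wells. The sketch below, illustrative Python with synthetic numbers, computes the widely used Z'-factor (Zhang et al., 1999) to judge assay quality and sets a hit threshold three standard deviations from the negative-control mean for a chosen phenotype parameter:

```python
import numpy as np

rng = np.random.default_rng(2)
neg = rng.normal(loc=1.0, scale=0.08, size=32)  # negative-control wells
pos = rng.normal(loc=2.0, scale=0.10, size=32)  # positive-control wells

# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
z_prime = 1 - 3 * (pos.std() + neg.std()) / abs(pos.mean() - neg.mean())
print(f"Z' = {z_prime:.2f}  (above ~0.5 is usually considered robust)")

# Call hits on one phenotype parameter at mean + 3 SD of the negative controls
threshold = neg.mean() + 3 * neg.std()
samples = rng.normal(loc=1.05, scale=0.15, size=384)  # test wells
print(f"hits: {(samples > threshold).sum()} of {samples.size} wells")
```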

The image-analysis software that drives HCS systems routinely quantifies cell images to yield a numerical description of the phenotypes. For large experiments, however, Dr. Leven described the researcher’s need to go back and view an image associated with an interesting or suspicious measurement as a persistent bottleneck.

“The image is the experiment,” said Dr. Leven. A hit should signify a change in the cells, but it could also be an anecdotal finding or the result of an out-of-focus image. Distinguishing true hits from false positives remains a challenge.

Dr. Leven recounted HCS projects that Genedata has performed for its pharma customers, emphasizing the ability of the company’s High Content Analyzer—a new addition to the Genedata Screener® enterprise solution—to retrieve any desired image immediately. The high-throughput HCS projects he described analyzed 40,000 compounds per day, for a total campaign of more than two million compounds, generating multifeatured data sets for each well.

PerkinElmer’s high-content screening portfolio includes the Opera confocal microplate image reader and Acapella™ image-analysis software, the compact Operetta HCS system, driven by Harmony™ software, and the Columbus™ data-management system and new Columbus 2.0 for use with the Opera platform.

Gabriele Gradl, Ph.D., global product leader for HCS at PerkinElmer Cellular Technologies, emphasized the complexity involved in deriving robust, quantitative data from image analysis of high-content screens. Whereas fluorescence-based analysis typically relies on identifying objects in cells and measuring their fluorescence intensities, PerkinElmer has developed a computational strategy that is independent of absolute fluorescence intensity, relying instead on texture analysis and quantitative pattern analysis.

Texture-analysis tools can detect patterns and effects that would not be apparent on routine visual analysis, according to Dr. Gradl. Threshold adjacency statistics is one example of such a tool. It searches for differences in fluorescence intensity values between adjacent pixels over a defined distance. Dr. Gradl described the particular advantages of applying texture analysis for detecting subtle morphologic changes associated with cell viability or toxicity assays and in stem cell research. It can detect differences not visible to the eye and identify changes that the user might not even have known to look for in the data. She presented, as an example, the use of texture analysis to assess mitochondrial integrity, as loss of mitochondrial activity and enhanced mitochondrial biogenesis are early markers of cytotoxicity.
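
Threshold adjacency statistics were introduced by Hamilton et al. (BMC Bioinformatics, 2007). A minimal version, assuming a single global threshold and 8-pixel connectivity (an illustrative Python sketch, not PerkinElmer’s implementation), histograms how many neighbors of each above-threshold pixel are also above threshold:

```python
import numpy as np
from scipy.ndimage import convolve

def threshold_adjacency_stats(image, threshold):
    binary = image > threshold
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0  # count the 8 neighbors, not the pixel itself
    neighbors = convolve(binary.astype(int), kernel, mode="constant", cval=0)
    counts = neighbors[binary]  # neighbor counts for foreground pixels only
    hist = np.bincount(counts, minlength=9)[:9]
    return hist / max(hist.sum(), 1)  # normalized 9-bin texture signature

rng = np.random.default_rng(3)
img = rng.random((128, 128))  # stand-in for a fluorescence image
print(threshold_adjacency_stats(img, threshold=0.7))
```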

Dr. Gradl also described the use of texture analysis in brightfield imaging and the ability to assess segmentation based on granularity, enabling label-free proliferation assays and analysis of cell differentiation in real time.

The algorithms developed by PerkinElmer can apply texture analysis to whole cells or to specific intracellular compartments depending on the design of the assay. The company is exploring a range of applications for its texture-analysis software tools, including stem cell differentiation analysis, quality control of stem cells produced for therapeutic use, live-cell imaging over time, and 3-D tissue sample analysis.

Earlier this year, GE Healthcare introduced the IN Cell Analyzer 2000 cell-imaging system, which incorporates several new features: preview scoring of a selected area of a sample before an acquisition run; a large-chip CCD camera coupled with a widefield illumination source that, according to GE, delivers twice the brightness of a conventional xenon lamp; whole-well imaging; objectives ranging from 2x to 100x; six image-restoration modes; and a manual microscope mode.

Fred Koller, Ph.D., president and CEO of Cyntellect, launched the company’s new Celigo™ cytometer at the “High Content East” meeting, emphasizing the system’s ability to image “every cell in every well,” from edge to edge without edge effects, using both brightfield and fluorescence imaging. Cyntellect’s optical technology achieves high-quality, large-field imaging using a set of mirrors to capture each well in its entirety without moving the plate and without the need to refocus, allowing rapid, full-plate imaging.

Celigo provides “uniform illumination with no gradient across the well,” said Dr. Koller, and allows for a combination of label-free imaging and three-color fluorescence. He described the instrument’s capabilities for performing cell-counting assays, cell growth tracking, and confluency studies, for example, and for noninvasive imaging of stem cell cultures without disrupting their three-dimensional colony structures. Celigo can switch from single-cell to colony-counting mode.

The company has also developed a secretion assay for use on the Celigo that measures the amount of protein secreted by individual cells. The assay can distinguish between high and low secretors and is useful for detecting heterogeneity and instability in cell cultures early in process development, such as for antibody manufacturing.

The Cellular Imaging and Analysis group at Thermo Fisher Scientific introduced the Cellomics iDev™ intelligent assay-development workflow for HCS image analysis at “High Content East.” Users work with training image sets of positive and negative biology, applying imaging and analytical algorithms that allow for real-time interaction with the images. The software uses the biological data generated to optimize assay protocols.
