February 1, 2014 (Vol. 34, No. 3)

Kathy Liszewski

The ever-expanding field of high-content analysis (HCA) continues to lure scientists from many disciplines. Because of its versatility, high-content screening can be used for many applications including basic research, target identification, primary screening, and predicting clinical outcomes. CHI’s upcoming “High-Content Analysis” meeting will include presentations on monitoring cellular changes in real time, conducting multiplexing assays via biological bar-coding, and measuring neurodegenerative changes resulting from the failure to clear biological debris.

High-content analysis combines the strength of three proven tools, notes Scott Keefer, manager of product marketing for Thermo Fisher Scientific’s cellular imaging and analysis products. “The first is fluorescent microscopy that not only provides flexibility for reagent selection but also the ability to record and then measure cellular changes at different resolution levels. A second important tool is use of a plate reader. This easy-to-use automation allows high-capacity quantitation and keeps true to a multiwell or multisample format. The third tool harnesses the power of flow cytometry for multiparameter assessment of individual cell populations,” Keefer continues.

“What is unique about a dedicated technology for high content rather than, for example, building a plate reader into a high-content machine, is the breadth and depth of applications that can be addressed. In other types of platforms, users have much less flexibility.”

High-content screening follows a straightforward four-step process. First, cells are plated and treated with a compound of interest or siRNA designed to elicit a response. Second, cells are stained with fluorescent dyes that target desired subcellular organelles or proteins. Third, cells are imaged to extract data from cells and wells. Finally, data is mined to characterize phenotypes.
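The last two steps, extracting per-cell data and mining it into phenotypes, can be sketched in a few lines of code. The example below is a minimal illustration only, assuming per-cell features have already been segmented out of the well images; the feature names and cutoff values are hypothetical.

```python
# Minimal sketch of HCS steps 3-4: per-cell features -> phenotype call.
# Feature names and thresholds are hypothetical illustrations.

def classify_phenotype(cell, intensity_cutoff=500.0, area_cutoff=120.0):
    """Bin a single cell into a phenotype from two measured features."""
    if cell["marker_intensity"] > intensity_cutoff:
        return "responder" if cell["area"] < area_cutoff else "partial"
    return "non-responder"

# One "well" of segmented cells (step 3 would produce these from images).
well = [
    {"area": 100.0, "marker_intensity": 800.0},
    {"area": 150.0, "marker_intensity": 650.0},
    {"area": 140.0, "marker_intensity": 300.0},
]

# Step 4: mine the per-cell data into a well-level phenotype summary.
counts = {}
for cell in well:
    label = classify_phenotype(cell)
    counts[label] = counts.get(label, 0) + 1

print(counts)  # {'responder': 1, 'partial': 1, 'non-responder': 1}
```

In a real screen, the classification would of course be driven by many more features and by validated thresholds, but the shape of the data flow, images to per-cell measurements to binned phenotypes, is the same.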

High-content analysis is being utilized with increasing frequency in mainstream research, notes Keefer. “We are finding more use across the scale of complexity every day. Over 1,000 peer-reviewed publications through 2012, covering a breadth of studies such as neuronal regeneration, in vitro toxicology, proliferation, migration, and signaling, have been published with the variety of applications growing daily.”

Pathologic angiogenesis is a hallmark of cancer and other inflammatory diseases. Ephrin receptor tyrosine kinase A2 (EphA2) has been identified as a potential target for inhibiting angiogenesis in malignancies. “We wanted to establish a high-throughput assay for screening leads that inhibit EphA2 function,” says Susanne Heynen-Genel, Ph.D., director, high-content screening systems, Conrad Prebys Center for Chemical Genomics, Sanford-Burnham Medical Research Institute.

She, along with colleague Elena Pasquale, Ph.D., a professor in the NCI-designated cancer center at Sanford-Burnham and a participant in the institute’s tumor initiation and maintenance program, developed a phenotypic approach for high-content screening that was based on cell retraction/contraction in response to activation of EphA2 by binding of a recombinant form of its ligand, ephrin-A1 Fc.

According to Dr. Heynen-Genel, “The assay we developed used a prostate cancer cell line, PC3, engineered with a membrane-targeted green-fluorescent protein (GFP). Cells were first adhered to microtiter wells and then preincubated with a compound library followed by addition of the ephrin ligand. We quantified cell retraction at different time points using the GFP fluorescence to outline the cell shapes.”

Dr. Heynen-Genel says they first developed the assay in lower throughput and then transitioned it into the high-content arena. This was not without its challenges. “When extending the assay to higher throughput in 384-well plates, we found that a number of conditions needed to be more exactly optimized, such as concentration of ligand, temperature of incubation, and timing of maximum retraction. We also needed to better define extent of cell retraction. Our solution was to examine six or seven parameters instead of one. The multiparametric approach worked, but required more complex analysis.”
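A multiparametric readout of the kind Dr. Heynen-Genel describes can be sketched as a handful of per-cell shape parameters combined into a single score. This is not the Sanford-Burnham pipeline; the features, values, and weights below are hypothetical, and a real screen would derive them from the GFP-outlined cell masks.

```python
import math

def shape_features(area, perimeter):
    """Derive simple morphology parameters from a segmented cell outline."""
    # Circularity is 1.0 for a perfect circle, lower for spread-out shapes.
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return {"area": area, "perimeter": perimeter, "circularity": circularity}

def retraction_score(before, after):
    """Combine parameters: retracted cells both shrink and round up."""
    area_loss = 1.0 - after["area"] / before["area"]
    rounding = after["circularity"] - before["circularity"]
    return 0.5 * area_loss + 0.5 * rounding  # hypothetical equal weighting

# Hypothetical measurements before/after ephrin-A1 Fc stimulation.
spread = shape_features(area=1200.0, perimeter=220.0)   # flat, spread cell
retracted = shape_features(area=400.0, perimeter=75.0)  # contracted cell

score = retraction_score(spread, retracted)
print(round(score, 3))  # 0.624
```

Scoring several parameters at once, rather than area alone, is what lets the assay distinguish genuine EphA2-driven retraction from unrelated changes in cell shape, at the cost of the more complex analysis the researchers note.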

Dr. Pasquale notes that these assays may provide more information than just comparison of lead compound activity. “EphA2 is widely expressed in tumor cell vasculature, but less so in normal cells. It isn’t clear how it works in cancer cells as related to migration, invasiveness, proliferation, and signaling pathways. These screens may reveal new mechanisms for how the receptor acts and suggest new approaches for targeting malignancies.”


High-content analysis (HCA) combines digital microscopy and flow cytometry. HCA can outperform either technology in many applications, and the integration of data-mining capabilities makes HCA even more powerful.

Bar-Coding for Imaging

One of the major challenges in improving cell-screening capabilities is increasing biological significance and clinical relevance. One way to enhance that process is to screen several biologically relevant assays in parallel, according to Fred Schaufele, Ph.D., associate professor, Center for Reproductive Sciences, University of California San Francisco.

“We can generate much more biologically relevant assays using a combination of different assays, that is, by multiplexing. High-throughput, multichannel fluorescence microscopy allows quantification of multiple specific cellular subcompartments.”

One problem with that approach, however, is that multiple fluorophores often overlap in their excitation and emission properties. “We developed a cellular bar-code technology of assay-specific markers consisting of distinct combinations of different ‘red’ fluorescent proteins coupled to a nuclear localization signal that combines multiple cell-based assays into one well. The bar-code markers are excited by a common wavelength of light, then distinguished due to their different relative fluorescence in two emission channels. This allows us to link different reporters to different bar-code-marked cells and to realize multiple reporter outputs from a single well in a drug screening campaign.”
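The ratio-based decoding can be sketched as follows. The channel names, intensities, and reference ratios here are hypothetical; the point is that a single excitation wavelength plus two emission channels yields a per-cell ratio that identifies the bar code.

```python
def decode_barcode(em1, em2, codes):
    """Assign a cell to the bar code whose reference ratio is closest."""
    ratio = em1 / em2  # relative fluorescence in the two emission channels
    return min(codes, key=lambda name: abs(codes[name] - ratio))

# Hypothetical reference ratios for three red-FP bar codes.
codes = {"barcode_A": 0.4, "barcode_B": 1.0, "barcode_C": 2.5}

# Per-cell intensities measured in the two emission channels.
cells = [(420.0, 1000.0), (980.0, 1010.0), (2400.0, 960.0)]
labels = [decode_barcode(e1, e2, cells and codes) for e1, e2 in cells]
print(labels)  # ['barcode_A', 'barcode_B', 'barcode_C']
```

Once each cell is assigned a bar code, any reporter signal measured in that cell (the yellow FP channel in the UCSF work) can be attributed to the corresponding assay, which is what allows multiple assays to share one well.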

“A small start-up, XCellAssay, has been created with the goal of realizing up to 56 different marker/reporter combinations, although we have, to date, established the methodology with just three reporters.”

Dr. Schaufele’s laboratory studies the androgen receptor and prostate cancer. Most prostate tumors are dependent upon androgens for survival and proliferation. Thus, the androgen receptor is a prime target for intervention, particularly the lowering of androgens by competitive inhibition. “The goal of treatment is to selectively modulate the androgen receptor by activating or inhibiting only certain pathways. The bar code enables us to search for compound leads that fall into different activity classes, based upon their activity profiles at distinct androgen receptor functions.”

According to Dr. Schaufele, with the ever-increasing costs of drug development, the ability to identify a highly specific drug candidate across multiple assays can greatly reduce the cost of post-screening therapeutic workups while enabling the rapid definition of drug leads likely to have the least disruptive off-target effects. The multiplexing abilities also “bin” hits into similar or different activity profiles across the multiple assay readouts to define distinct functional classes of hits. Dr. Schaufele suspects this ability will minimize the waste of time, effort, and money that occurs when similar leads are unknowingly worked up in costly studies while other, potentially more relevant leads in other activity classes go unexamined.


UCSF researchers developed a bar-coded nuclear reporter. Two cell lines, each expressing a different yellow fluorescent protein (FP)-linked reporter, are co-cultured. Cells in the mixture are identified by automated image analysis programs that define each cell nucleus marked with red FPs. Via different red FPs, the cell lines are distinguished by their characteristic ratios of red fluorescence in two different emission channels. Then the amounts of yellow fluorescence are assigned to the specific reporter associated with each bar-coded cell. [Source: Krylova et al. (2013) PLoS ONE 8(5): e63286.]

Measuring Autophagy

Autophagy is a degradation mechanism in which cytoplasmic components, such as proteins or organelles, are delivered to lysosomes via double-membraned organelles termed autophagosomes. This dynamic process is called autophagic flux. Recently, upregulation of autophagic flux has been eyed as a therapeutic strategy for neurodegenerative diseases such as Alzheimer’s disease, Huntington’s disease, and Parkinson’s disease. Scientists at Amgen are examining this process by seeking to identify genes involved in autophagy.

“A hallmark of these diseases is failure to clear protein debris,” notes Christopher Hale, Ph.D., scientist, discovery technologies, Amgen. “The main question is what assay will show an overall increase in flux in this multistage process? Often assays look only at one stage of the process. We sought to identify genes involved in autophagy by performing a live-cell, image-centered, high-content siRNA screen. We used U-2-OS cells that were stably transfected with a single fusion construct made up of two fluorescent proteins, one green (GFP) and one red (RFP) linked to LC3, microtubule-associated protein 1 light chain 3. LC3 is a widely used marker of the autophagocytic process that undergoes post-translational modification by a ubiquitination-like reaction.”

Dr. Hale’s team imaged cells for up to 96 hours, necessitating use of an instrument that would also keep cells in a closed, cell-friendly environment. “RFP and GFP have different pKa values and will quench at different pH levels. Once an autophagosome bearing RFP-GFP-LC3 has fused with an acidic lysosome, the GFP signal is quenched due to its instability at low pH. This leaves a GFP-negative, RFP-positive autolysosome and allows us to monitor the progression of autophagy.”
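The RFP-GFP-LC3 readout Dr. Hale describes amounts to a two-channel classification of each punctum: both channels bright indicates an autophagosome, while red-only indicates GFP has been quenched in an acidic autolysosome. The sketch below illustrates that logic; the intensity values and thresholds are hypothetical, not Amgen's.

```python
def classify_punctum(gfp, rfp, gfp_cut=200.0, rfp_cut=200.0):
    """GFP quenches at lysosomal pH, so red-only puncta are autolysosomes."""
    if rfp > rfp_cut and gfp > gfp_cut:
        return "autophagosome"   # neutral pH: both signals intact
    if rfp > rfp_cut:
        return "autolysosome"    # acidic: GFP quenched, RFP persists
    return "background"

# Hypothetical (GFP, RFP) puncta intensities from one cell, one time point.
puncta = [(900.0, 850.0), (30.0, 780.0), (40.0, 820.0), (15.0, 20.0)]
counts = {"autophagosome": 0, "autolysosome": 0, "background": 0}
for gfp, rfp in puncta:
    counts[classify_punctum(gfp, rfp)] += 1

# Increasing autophagic flux shows up as autolysosomes accumulating
# relative to autophagosomes across the imaging time course.
print(counts)  # {'autophagosome': 1, 'autolysosome': 2, 'background': 1}
```

Tracking these two counts per cell over the 96-hour time course is what distinguishes a genuine increase in flux from a block downstream, where autophagosomes would pile up without ever becoming autolysosomes.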

To analyze about 100 genes in this profiling assay, Dr. Hale developed analysis algorithms for visual feedback to quantitate autophagosomes and autolysosomes per cell over time. “We identified genes that both increased and decreased autophagic flux. We found that it was critical to use several pathway controls, including mTOR, which maintained elevated counts of autolysosomes per cell over time. From our studies, we found a handful of genes that we can now pursue therapeutically.”

Probing the Estrogen Receptor

The estrogen receptor (ER) is a predominantly nuclear resident transcription factor that is activated by estrogen binding, which leads to ER targeting a specific DNA sequence to regulate the activity of different genes. It consists of two forms, alpha (ERα) and beta (ERβ). Michael A. Mancini, Ph.D., professor, department of molecular and cellular biology, Baylor College of Medicine (BCM), is investigating the mechanism of ER-mediated transcriptional regulation, including classifying the effects of endocrine disrupting compounds (EDC) and native ligands.

“We developed a biosensor cell line called GFP-ERα:PRL-HeLa array. These cells stably express green-fluorescent protein tagged to ERα and also harbor a multicopy integration of an estrogen-responsive transcriptional reporter gene based upon the rat prolactin gene (PRL). We utilize this model in 384-well high-throughput microscopy-based assays that screen transcriptional co-regulator siRNA libraries. The cell model and ‘high-content’ imaging platform we developed facilitates rapid and quantitative measurements involved in ER transcription, simultaneously, and has identified novel ER co-regulators that are required for activity.”

Using the high-throughput approach they developed over the last decade, Dr. Mancini’s laboratory has classified a number of compounds that regulate ER across multiple mechanistic steps. “Using a panel of bisphenol A analogs in our screening, we identified both estrogenic and anti-estrogenic effectors on both ERα and ERβ. The dataset can be huge, with tens of thousands of images being acquired from each plate, sometimes taking several hours.”

The group followed up on leads using biochemical assays, fluorescent in situ hybridization, and PCR. “All these studies have required a wide range of state-of-the-art imaging and software resources that are housed in the Integrated Microscopy Core, a member of the Advanced Technology Core network at BCM. Collectively, our studies have identified ER modulators; provided a better mechanistic classification of ER ligands, co-regulators, and EDCs; and have suggested new druggable targets. We hope investigations such as these will ultimately lead us closer to more predictive personalized medicine.”

High-Content Analysis Celebrates Its Sweet 16!

Scott Keefer

This year, high-content analysis celebrates its “Sweet 16,” marking the introduction of the first commercial platform to quantitatively image and analyze cells to investigate phenotypic changes more closely at the single-cell level. Much like a moody teenager, the technology is still trying to find a foothold in a world of high-end microscopy and lower-end cell counters, both of which high-content analysis technology can outduel depending on the application.

With thousands of users of the software, hardware, and informatics tools that constitute this technology, along with new vendors entering this hotly contested market, high-content analysis is gaining traction and quickly becoming a standard tool in the lab. High-content analysis is not only changing how basic research is done through increasing the scale of cell biology research, it is also becoming integrated into all levels of throughput for drug discovery and safety, revolutionizing quantitative neuroscience at the cell level, and accelerating applications in cancer research.

I remember a conversation with a colleague about trying to figure out if there was a “killer application” associated with high-content analysis. Was it looking at neurite outgrowth? Translocation events? Stem cell differentiation assays? After much argument, we agreed that the technology of high-content analysis itself was the “killer” application, fitting the niche of rapid and sensitive quantitative cell analysis across a broad application portfolio in one bundled platform.

It is an important perspective that high-content analysis is not one thing; it is neither a modified microscope nor a more capable plate reader. As the technology enters its 17th year, high-content analysis is a balanced blend of the right image, know-how, and a bit of controlled chaos to get the most out of cellular imaging and analysis. So as high-content analysis sprints toward adulthood, the evolution will continue in how to “leverage the cloud” or “create standards.”

The ideas that will lead to these advancements will have been built off the backs of experts and all types of users around the world. Speaking from well beyond the teenage years, I personally look forward to the day when high-content technology becomes an even more intelligent lab tool that accelerates cell research. I think the hope of discoveries and cures for diseases tomorrow will certainly depend on the advancement of this technology today.

Scott Keefer ([email protected]) is manager, product marketing, Thermo Fisher Scientific, Cellular Imaging and Analysis.


A researcher loads a plate into the Thermo Scientific™ CellInsight™ NXT HCS Platform, a high-content screening tool for the automatic imaging and analysis of cells. It can provide real-time quantitation of cells for basic and advanced research applications.
