September 1, 2009 (Vol. 29, No. 15)

Past, Present, and Future of Enabling Technology in Academia and Industry

GEN recently sat down with thought leaders in the cellular imaging and analysis space to talk about current and emerging trends in the field. The three experts, all intimately familiar with imaging applications, discussed the use of high-content screening in industry and academia, bottlenecks in the workflow, gaps in reagents, and live-cell versus fixed-cell imaging.

James G. Evans, Ph.D., from Anon Consulting; Kelvin Lam, Ph.D., director of high-throughput screening at the Harvard Stem Cell Institute; and Achim von Leoprechting, Ph.D., vp and GM, cellular imaging and analysis solutions at PerkinElmer, were the panelists.

GEN: How extensively is high-content imaging being used in drug discovery and academia?

Evans:  High-content screening (HCS) was developed in drug discovery. Six years ago, MIT and the Whitehead Institute partnered with Cellomics to bring high-content screening into an academic environment, particularly for systems biology applications. Right now, the two areas are pretty even. There’s a lot of high-content screening being done in both drug discovery and academia. RNAi screening is probably the predominant academic application, particularly at places like the Broad Institute.

Lam: High-content screening has traditionally been used as a secondary screen. However, it is moving toward the primary screening area due to advances in data workflow and data management.

von Leoprechting: Traditionally, high-content screening was used in drug discovery, mainly in secondary screening or for tox applications where it offered a particular benefit over existing methods. From there, academic screening centers started to use HCS with phenotypic assays for mechanistic research and drug screening. Today HCS has become a well-established tool for both cell-based drug discovery and academic research, and technology advancement is now clearly led by academia, where many new developments, such as higher-throughput technologies, better spectral-resolution techniques, and improved image-analysis tools and reagents, are emerging.


GEN: How has the emergence of phenotypic analysis impacted imaging?

von Leoprechting: In pharma drug discovery, phenotypic assays have not historically been used for lead identification, as they were not widely accepted by medicinal chemists. Today this is changing, driven by the obvious advantages of phenotypic screens, for example in RNAi campaigns or in the re-evaluation of established drugs for new applications and therapies. With the advances in image analysis and data management, the benefits of phenotypic over biochemical screening methods are becoming more and more important in the design of drug discovery campaigns.

Lam: I think of phenotypic analysis when I think about high-content screening, because you are looking at homogeneous subpopulations within a heterogeneous population. If you have 100 cells and 20 of them express a specific marker, you're analyzing that small subpopulation within the heterogeneous whole. A phenotypic screen means that you're dealing with a heterogeneous cell population, whereas in traditional drug discovery we screen cell lines, which are homogeneous populations. In a more complex drug discovery paradigm, how would you screen a heterogeneous population? That's where the phenotypic screen comes in.

Evans: Classically, biologists and engineers tend to assume that cell lines are homogeneous, but typically there's a lot of variation from cell to cell, driven by cell state and the microenvironment. Especially when you're making subtle and complex measurements such as phenotypic measurements, your analyses can be quite difficult to interpret due to noise from that cell-to-cell variation. This has been quite off-putting to chemists who, in many cases, want a single number per compound.

The relatively high level of complexity in interpreting data from phenotypic screens is the major problem in getting their adoption into drug discovery. And part of that has been due to the lack of development of tools that really allow you to look at the variation within populations and compare treated and control populations in the way that a well-trained biologist or pathologist is able to do—filter out (in their heads) the noise.

The problem in translating from pathology to drug discovery is achieving real quantitation and scalability, which you don't get in the pathology lab, where results depend heavily on the individual doing the screening. So coming up with a software package that delivers that expert-level analysis, is scalable, and can adapt to different biologies is a real challenge for software developers.
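To illustrate the population-comparison idea Evans raises, here is a minimal, purely hypothetical sketch: instead of collapsing each well to one number per compound, it compares the full per-cell feature distributions of a treated and a control well with a two-sample Kolmogorov-Smirnov test. The simulated feature (nuclear area), the numbers, and the choice of test are our own assumptions, not anything the panelists describe.

```python
# Illustrative sketch: comparing per-cell feature distributions between a
# treated and a control well, rather than one summary number per compound.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Simulated per-cell measurements of one phenotypic feature (e.g. nuclear area,
# in arbitrary units) for ~500 cells per well.
control = rng.normal(loc=100.0, scale=15.0, size=500)  # untreated population
treated = rng.normal(loc=115.0, scale=15.0, size=500)  # compound shifts the mean

# A two-sample KS test compares whole distributions, so shifts in a
# subpopulation can register even when the well-level mean barely moves.
stat, pvalue = ks_2samp(control, treated)
print(f"KS statistic: {stat:.3f}, p-value: {pvalue:.2e}")
```

A distribution-level comparison like this is one simple way to retain the cell-to-cell variation that a single per-well number discards.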


GEN: Are there bottlenecks, and if so, where are they?

Lam: There are three bottlenecks: image acquisition, software to analyze the image data, and information management. These are the areas we need to improve upon, rather than just the hardware.

von Leoprechting: We are collaborating with open-source networks to help break some of the bottlenecks in image analysis and storage. The point is not just managing and analyzing the data, but also making it more accessible to users. This is also where we see a difference between some of the pharma labs, especially in assay development, and academic labs.

In academia you often have many highly experienced experts who can run complex data-mining and data-analysis programs. But in a drug discovery therapeutic group, such expertise is often not available for assay development, as this knowledge mainly resides in the screening labs. So you would need to develop two-tiered software packages: one for the experts and one that lets new users set up assays easily and quickly.

Evans: I definitely believe that information management has been a key aspect of being able to scale high-content screening. One of the problems that we’ve been facing for the last few years is how to actually take all that information and build knowledge, because that’s what you ultimately want. You want to be able to make informed decisions on which drugs to develop or which targets to follow up on.

A major part of the pipeline that is missing is a modeling environment where new information can be piped in as it is being generated and models refined.  


GEN: What are the gaps in cellular imaging reagents, and are label-free methods making an impact?

Evans: There is still a lot of room for improvement in the live-cell area. There are companies that are engineering cell lines with reporters or markers of interest that are expressed stably, so you don’t have to add antibodies or live-cell dyes, which helps a lot, as long as the cell line has been validated.

There is definitely room for improvement in the performance and variety of available live cell dyes. With image analysis, there’s some catching up to do to deal with that multiplicity of labels. So it’s sort of a chicken-and-egg situation where the multispectral aspects of the reagents haven’t been pushed because the analysis capability hasn’t been there to cope with it.

Live-cell dyes and cells engineered with fluorescent proteins are the biggest gap that I see. There's also some development of multifunctional reporters, not just for protein species but also for forces: reporters that measure the traction a cell exerts, look at its interactions with neighboring cells or with the matrix, and provide an actual readout. There are also calcium sensors, of course, which are the most prevalent nonprotein readouts and have been used primarily in industry.

von Leoprechting: Live-cell imaging will be driven mainly by label-free approaches and by innovations in FRET-type reagents. The big problem with fluorescent probes is phototoxicity over longer periods of exposure. This is where, on the reagent-development side, new dyes with higher quantum yields and improved linkers are of interest. For the same reason, spinning disk confocal systems already allow you to minimize the toxic impact of the light energy the cell is exposed to. In HCS, our observation is that for live-cell assays, label-free techniques such as texture analysis, as well as 3-D/4-D features, are becoming more important.

GEN: What percentage of your work is live cell vs. fixed?

Lam: Currently, live cells are used mainly by scientists looking through a microscope. As for using live cells in high-content screening, in our case it would be minimal. But we are planning to expand our stem cell research into the live-cell imaging area.

Evans: Based on my past experiences at MIT and the Whitehead, there has been a growing movement toward live-cell assays. But it’s definitely still 10 percent or less of the total assays. Typically, some of the success stories have been in things like wound-healing assays and stem-cell differentiation, where people have done live-cell imaging over long periods.

von Leoprechting: Many drug discovery companies would like to do more live-cell assays, including the use of primary tissues, worms, or zebrafish for tox applications, but here the bottleneck is mainly in the upscaling and quality control. In contrast with the academic world, where researchers often look at kinetics over days or even longer, drug discovery researchers measure kinetics predominantly by time points. Doing this in a standardized format for thousands of data points is where we're currently hitting the ceiling of what's possible in terms of sample and data management.

There are technical solutions emerging that enable integrated live-cell friendly environments for HCS in pharma. This trend is coming from a strong desire to move to more relevant live-cell applications.

Evans: The allure of kinetic analysis is that you can avoid those end-point measurements. Especially where you have a heterogeneous response, and particularly in phenotypic assays, it's very hard to marry up the different time points, deal with the noise that arises between those end points, and see whether you still have robust trends.

If you have a kinetic analysis of the same cells over time, a lot of that noise and variation in cell states is correlated and can be removed, so you can follow a cell as it changes from healthy to sick to dead and measure the transitions between states more easily. There is a trade-off, and I believe it depends on where in the pipeline the assay sits. While there are throughput requirements, when you start building models it is nice to be able to measure the state change, and the rates of state change, within your population.
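As a purely illustrative sketch of the per-cell tracking Evans describes, the fragment below classifies one cell's time course into healthy/sick/dead states and counts the transitions between them. The viability score, thresholds, and function names are all hypothetical assumptions for illustration, not anything from the interview.

```python
# Illustrative sketch: following one cell's kinetic trace through discrete
# states (healthy -> sick -> dead) instead of comparing unlinked end points.

def classify(value, sick_below=0.6, dead_below=0.2):
    """Map a per-cell viability score (0..1, hypothetical) to a state."""
    if value < dead_below:
        return "dead"
    if value < sick_below:
        return "sick"
    return "healthy"

def transitions(trace):
    """Classify each time point, then list the state changes along the trace."""
    states = [classify(v) for v in trace]
    changes = [(a, b) for a, b in zip(states, states[1:]) if a != b]
    return states, changes

# One cell followed over six time points, declining from healthy to dead.
trace = [0.9, 0.8, 0.5, 0.4, 0.15, 0.05]
states, changes = transitions(trace)
print(states)   # per-time-point states
print(changes)  # observed state transitions
```

Because the same cell is tracked, the transition events themselves (and their timing) become measurable, which is the noise-reduction benefit Evans points to.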

GEN: Which applications are currently utilizing live-cell imaging more than others?

Evans: Developmental areas and neurobiology are prevalent in both academia and industry, especially for 3-D live cells.

Lam: Cell-cycle measurement and cell division are two applications that I believe are using live cells successfully at this time. Other basic research areas that have traditionally used live cells are certain developmental biology applications.

von Leoprechting: The use of live-cell imaging in infectious diseases and cancer research is growing fast. Specifically, the use of live-cell imaging in viral infection and invasion models or cancer metastasis experiments is increasing in importance.
