PerkinElmer (www.perkinelmer.com) plans to enter the HCS-provider arena with a suite of offerings, enabled by its acquisition of Evotec Technologies (www.evotec-technologies.com). Evotec currently provides systems for confocal imaging, cell handling, and ultra-high-throughput screening (uHTS), as well as image-capture and cellular-analysis software.
“PerkinElmer already offers a variety of cellular science products, such as the CellLux (an automated cellular fluorescence imaging platform), the LumiLux™ (an automated cellular luminescence imaging platform), and assay chemistries. These plate readers are complementary to Opera™ and Acapella™, a confocal imaging system and sophisticated imaging software, respectively, both of which will be acquired through Evotec Technologies. The combination of these systems significantly boosts the number of applications that PerkinElmer can serve in cellular screening,” says Mary Duseau, business unit leader for detection and analysis systems at PerkinElmer.
Other technologies of interest include the microchannel flowthrough device, which is essentially a cell processor with dielectric elements. It allows individual cells to flow through the system and be imaged, and it also enables localization studies in live cells.
“Customers are looking to add cellular capability to their existing HTS setup. We will be able to offer the EVOscreen® uHTS system, which has the ability to integrate multiple platforms and is amenable to HCS,” adds Duseau.
HCS can provide a lot of useful information if one can effectively sift through and analyze the enormous amount of data generated from such screens. Charles Y. Tao, Ph.D., laboratory head of genome and proteome sciences at Novartis Institutes for BioMedical Research (www.novartis.com), describes data-analysis strategies for high-content screening. “High-content screens result in a huge amount of data. For example, a whole-genome screen (comprising about 25,000 genes) with a simple reporter-gene assay would typically result in 50,000 data points per screen. These data points contain only intensity information.
“However, a whole-genome screen with the HCS approach could potentially result in billions of data points per screen after image quantification. The data overload is often a bottleneck, and it is critical to have effective data-analysis strategies in place,” comments Dr. Tao.
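To put those figures in perspective, the arithmetic can be sketched in a few lines of Python. The duplicate-well count and per-cell feature count below are illustrative assumptions, not numbers from Dr. Tao's screens:

```python
# Back-of-the-envelope estimate of HCS data volume (illustrative numbers).
genes = 25_000             # whole-genome library size
wells_per_gene = 2         # duplicate wells per gene (assumed)
cells_per_well = 1_000     # imaged cells per well (assumed)
features_per_cell = 50     # morphology/intensity features per cell (assumed)

# Simple reporter-gene assay: one intensity reading per well.
reporter_points = genes * wells_per_gene
print(f"Reporter assay: {reporter_points:,} data points")    # 50,000

# Image-based HCS: every cell contributes a full feature vector.
hcs_points = genes * wells_per_gene * cells_per_well * features_per_cell
print(f"HCS:            {hcs_points:,} data points")         # 2,500,000,000
```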
He developed a data-analysis platform at Novartis to address this problem. The goal is to mine HCS data for information relevant to drug discovery using various statistical and bioinformatic approaches, and to ensure data quality.
“We perform data analysis at every stage of our HCS campaign. The first step is quality control and data normalization based on preselected parameters. This is an often-ignored aspect of data analysis. Errors are detected, and systematic errors are corrected where possible.
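Dr. Tao does not detail the normalization method itself. A common choice for this first step is a robust per-plate score computed against on-plate controls, sketched here under that assumption (the function names and the median/MAD z-score are illustrative, not Novartis' actual procedure):

```python
import numpy as np

def normalize_plate(values: np.ndarray, control_mask: np.ndarray) -> np.ndarray:
    """Robust z-score of well-level values against on-plate negative controls."""
    ctrl = values[control_mask]
    center = np.median(ctrl)
    # Median absolute deviation, scaled to approximate a standard deviation.
    spread = 1.4826 * np.median(np.abs(ctrl - center))
    return (values - center) / spread

def z_prime(pos: np.ndarray, neg: np.ndarray) -> float:
    """Z'-factor: a standard QC gate; plates below ~0.5 are typically flagged."""
    return 1 - 3 * (pos.std() + neg.std()) / abs(pos.mean() - neg.mean())
```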
“At the second step, we apply various statistical methods to analyze the data. For example, in a cell-cycle study, machine-learning approaches are used to classify the millions of cells into the different phases of the cell cycle and to generate a cell-cycle profile for each gene or compound, while in a nuclear-translocation study, dose-response profiles are generated. Finally, bioinformatic methods, such as pathway analysis, are applied to further prioritize the hits,” explains Dr. Tao.
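Dr. Tao does not name the algorithm behind this classification step. One way to illustrate it is a supervised classifier that maps per-cell image features to cell-cycle phases and rolls the predicted labels up into a per-well profile; the sketch below uses a random forest on synthetic stand-in features (all dimensions and labels are placeholders):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
phases = np.array(["G1", "S", "G2", "M"])

# Synthetic stand-ins for per-cell features (DNA content, nuclear area, ...);
# a real screen would extract these from the segmented images.
X_train = rng.normal(size=(400, 5))
y_train = rng.choice(phases, size=400)      # expert-annotated training cells
X_screen = rng.normal(size=(1_000, 5))      # one well's worth of cells

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
labels = clf.predict(X_screen)

# Cell-cycle profile for the well: fraction of cells in each phase.
# This becomes the phenotypic signature for that gene or compound.
profile = {p: float(np.mean(labels == p)) for p in phases}
print(profile)
```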
As a case-study example, Dr. Tao will discuss a whole-genome HCS. With the data-analysis platform, “we were able to identify both known and new cell-cycle regulators,” says Dr. Tao.
Kenny Guo, research scientist in the applied genomics department at the Bristol-Myers Squibb Pharmaceutical Research Institute (www.bms.com), will discuss strategies to effectively streamline and analyze the complex and high-volume data that is generated by HCS. Bristol-Myers Squibb conducts high-content screening for target validation, compound-library screening, and determination of compound mechanism of action.
“Our high-content screens are run using a high-resolution, fluorescence-based imaging system. Cells are labeled and assayed for multiple endpoints. The output combines numeric and image-based data. For example, if we run a four-channel assay with three primary measurements per channel and 1,000 cells per well, the resulting twelve measurement combinations generate 12,000 data points from a single well. When you add in the image data at ten images per well (nearly 1,000 images per plate), the final output can easily reach a million data points from a single 96-well plate,” says Guo.
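Guo's arithmetic is straightforward to reproduce; the short calculation below shows the exact totals behind his rounded figures, using the 96-well format from the quote:

```python
channels = 4
measurements_per_channel = 3
cells_per_well = 1_000
images_per_well = 10
wells = 96  # standard 96-well plate

points_per_well = channels * measurements_per_channel * cells_per_well
print(points_per_well)                  # 12,000 data points per well
print(wells * images_per_well)          # 960 images per plate (~1,000)
print(wells * points_per_well)          # 1,152,000 numeric points per plate
```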
“To deal with this type of high-volume data, we use an integrated system built on a company-wide database that holds both the scanned data and the results data. It starts with a hierarchical storage system for the images and a relational database for the numeric results, both linked to a data-management system.
“Data annotation and data QC are then conducted in an automated fashion within the data-management system. Customized tools are used to rank and sort the potential hits. This data is then published for company-wide use. The data-management system is also integrated into data-visualization and -analysis software tools to maximize efficiency and productivity,” explains Guo.
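Bristol-Myers Squibb has not published the underlying schema; a minimal sketch of the linkage Guo describes, in which a relational results table and an image-path table reference the same wells, could look like the following (SQLite is used only for self-containment, and every table and column name here is hypothetical):

```python
import sqlite3

db = sqlite3.connect("hcs_results.db")
db.executescript("""
CREATE TABLE IF NOT EXISTS well (
    well_id  INTEGER PRIMARY KEY,
    plate    TEXT NOT NULL,
    position TEXT NOT NULL            -- e.g. 'A01'
);
CREATE TABLE IF NOT EXISTS image (
    image_id INTEGER PRIMARY KEY,
    well_id  INTEGER REFERENCES well(well_id),
    path     TEXT NOT NULL            -- pointer into hierarchical image storage
);
CREATE TABLE IF NOT EXISTS result (
    well_id  INTEGER REFERENCES well(well_id),
    channel  INTEGER,
    feature  TEXT,                    -- e.g. 'nuclear_intensity'
    value    REAL,
    qc_pass  INTEGER                  -- automated QC flag
);
""")
db.commit()
```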
An example apoptosis study, conducted with multiple markers to delineate the molecular mechanisms of compound cytotoxicity, will be presented at the conference.
Also scheduled to speak at the Cambridge Healthtech Conference, Steven Suchyta, Ph.D., technical specialist at Ambion (www.ambion.com), an Applied Biosystems (www.appliedbiosystems.com) business, says he will describe how the R&D group used targeted siRNAs to alter survivin expression. A series of biochemical and cell-based assays was performed on the transfected cells to assess known indicators of apoptosis and to better understand the role of survivin in cancer.