Perhaps the most important image acquisition applications in HCS relate to cellular imaging, including assays for drug effects, cytotoxicity, apoptosis, cell proliferation, and nucleocytoplasmic transport.
"With imaging one can simultaneously capture multiple measurements from individual cells including molecular colocalization, metabolic state, motility, cell cycle, texture, and cell morphology" says Judy Masucci, Ph.D., director of marketing and sales support at Cellomics (Pittsburgh, PA).
According to Dr. Masucci, no other single sensor modality can provide a comparable depth of information. Next-generation image acquisition instruments feature multispectral imaging, permanent, accurate alignment, and integration with downstream image processing and analysis software packages.
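To make that multiparametric claim concrete, the sketch below shows one generic way such per-cell measurements might be extracted from a segmented fluorescence image. It uses scikit-image rather than any vendor's analysis software, and the filename, segmentation method, and feature list are illustrative assumptions only.

```python
import numpy as np
from skimage import io, filters, measure

# Load a single-channel fluorescence image (filename is hypothetical).
img = io.imread("plate01_wellA01_dapi.tif")

# Segment cells with a simple global Otsu threshold -- production HCS
# pipelines use far more robust segmentation than this.
binary = img > filters.threshold_otsu(img)
labels = measure.label(binary)

# Extract several measurements per cell in one pass: size and shape
# (morphology) plus mean intensity (a proxy for marker expression).
features = measure.regionprops_table(
    labels,
    intensity_image=img,
    properties=["label", "area", "eccentricity", "intensity_mean"],
)

for i, cell_id in enumerate(features["label"]):
    print(cell_id, features["area"][i], features["eccentricity"][i])
```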
"HCS brings the mature field of cell biology to our assays, and imaging technology into the drug discovery environment," says Sjaastad. "Image-based screening is maturing quickly enough that primary screens can now run with imagers."
Advances in image acquisition have made certain types of screens more routine, including phenotypic and morphological screens. Phenotypic assays measure the ability of a virus, such as HIV, to replicate in the presence of a specific drug, providing a direct measurement of drug susceptibility.
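By way of illustration (and not the specific assays discussed here), drug susceptibility in such an assay is commonly summarized as an IC50 from a dose-response fit. The sketch below does this with SciPy; the concentrations and percent-replication readouts are invented for demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic4(conc, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Hypothetical drug concentrations (uM) and % viral replication.
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
replication = np.array([98.0, 95.0, 82.0, 55.0, 28.0, 10.0, 4.0])

# Fit the curve; p0 and bounds give the optimizer sensible limits.
params, _ = curve_fit(
    logistic4, conc, replication,
    p0=[1.0, 100.0, 0.3, 1.0],
    bounds=([0.0, 50.0, 1e-3, 0.1], [20.0, 120.0, 100.0, 5.0]),
)
print(f"Estimated IC50: {params[2]:.2f} uM")
```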
A talk at the CHI meeting by Berta Strulovici, executive director, automated biotechnology at Merck (Whitehouse Station, NJ), will focus on two case studies of phenotypic assays that highlight the utility of these assays in a high content screening context.
According to Neville, another speaker at the forthcoming conference, two major components make up the IT infrastructure in high content screening: computation and storage.
"The most common technical logistical challenge that IT architects face is keeping track of the millions of files that high content instruments create, keeping them safe, backed up, and stored on the most cost efficient media," he says.
The numbers involved are daunting. "A typical screen can generate over 10,000 high-resolution images a day, or hundreds of gigabytes of data," Sjaastad points out.
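The back-of-envelope arithmetic below shows why those numbers are daunting; the per-image size and working schedule are assumptions chosen only to illustrate the scale.

```python
# Rough storage estimate for a screening campaign (all figures assumed).
images_per_day = 10_000
mb_per_image = 20          # a multi-channel, high-resolution field
days_per_year = 250        # working days

daily_gb = images_per_day * mb_per_image / 1_000
yearly_tb = daily_gb * days_per_year / 1_000

print(f"~{daily_gb:.0f} GB per day, ~{yearly_tb:.0f} TB per year")
# ~200 GB per day, ~50 TB per year -- before backups or replication.
```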
On the computational side, many analysis techniques require complex algorithms to be run over the large volumes of data emerging from the instruments, and selecting a computational platform requires consideration of variables such as CPU choice and memory.
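A common way to address that computational load is to spread per-image analysis across CPU cores, since each image can usually be processed independently. The sketch below uses Python's standard multiprocessing module as a generic stand-in; the analyze function and directory layout are placeholders, not a specific vendor's software.

```python
from multiprocessing import Pool
from pathlib import Path

def analyze(image_path):
    # Placeholder for a real per-image algorithm (segmentation,
    # feature extraction, etc.); here we just report the file size.
    return image_path.name, image_path.stat().st_size

if __name__ == "__main__":
    images = sorted(Path("screen_run_001").glob("*.tif"))  # assumed layout
    # One worker per core; images are analyzed independently, so the
    # workload parallelizes cleanly across CPUs.
    with Pool() as pool:
        for name, size in pool.map(analyze, images):
            print(name, size)
```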
Neville points out that federal regulations often stipulate the length of time that records must be kept in storage. Another issue is access to the data: how much should be kept in long-term vs. short-term storage. Some data types, such as mass spectrometry, typically move into long-term storage on tape in 2 to 3 years.
Conversely, says Neville, results files created from the analysis of a raw file or an image may be kept on high-speed disk for a longer period of time, since this is what the investigator typically works with.
A central consideration in developing any HCS strategy is therefore defining an information life cycle: what data goes where, and for how long. Once this has been established, scalable strategies for archiving and backup can be defined.
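One way to make such a life cycle explicit is a simple policy table mapping each data class to a storage tier and retention period; the classes, tiers, and durations in the sketch below are illustrative assumptions, not recommendations.

```python
from datetime import date, timedelta

# Illustrative information life cycle: data class -> (tier, retention).
LIFECYCLE = {
    "raw_image":       ("tape_archive", timedelta(days=3 * 365)),
    "raw_ms":          ("tape_archive", timedelta(days=3 * 365)),
    "analysis_result": ("fast_disk",    timedelta(days=7 * 365)),
}

def placement(data_class, created):
    """Return the storage tier and expiry date for a file of this class."""
    tier, retention = LIFECYCLE[data_class]
    return tier, created + retention

tier, expires = placement("analysis_result", date(2006, 1, 15))
print(tier, expires)  # fast_disk 2013-01-13
```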
In addition to the physical problems associated with data storage is the question of the appropriate software for managing and querying the data. Perhaps the best-known model in this context is the Laboratory Information Management System, or LIMS. These custom software frameworks allow metadata to be associated with specific datasets using consistent annotation.
"This allows you to group your data logically and perform queries on it, i.e., show me all the files in our systems associated with trials of a specific drug,'" says Neville. He points out a third issue, data exchanges, which is often a problem for larger companies requiring solutions for moving data between different data storage centers.
"This usually entails the integration of various data formats and a web portal on the front end, allowing researchers in different organizational units to share data through a common interface," he points out.