Nov 15, 2011 (Vol. 31, No. 20)

Flow Cytometry Shifting Gears

Technology Shedding Its Slow, Cumbersome Reputation and Entering High-Throughput Realm

    Researchers at Purdue University have developed a new sample-processing approach. Shown here are dose-response curves in 3-D mode for a 384-well plate containing 32 drugs with 10-point dose-response curves. The positive controls are at the bottom right (columns 23 and 24), and rows 2 and 13 hold the highest drug concentrations. This representation visualizes the entire 384-well plate for any of the collected parameters simultaneously. The data were collected on a CyAn flow cytometer.
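    A whole-plate view of this kind boils down to mapping one summary value per well onto the plate's 16 × 24 grid and rendering it as a surface. A minimal sketch under that assumption, with simulated values standing in for the CyAn measurements:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # 384-well plate: 16 rows (A-P) x 24 columns, one summary value per well
    rng = np.random.default_rng(1)
    plate = rng.uniform(0, 100, size=(16, 24))  # simulated % response per well

    # Render the whole plate at once as a 3-D surface
    rows, cols = np.meshgrid(np.arange(16), np.arange(24), indexing="ij")
    ax = plt.figure().add_subplot(projection="3d")
    ax.plot_surface(cols, rows, plate, cmap="viridis")
    ax.set(xlabel="column", ylabel="row", zlabel="% response")
    plt.show()
    ```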

    Although established more than 30 years ago, the field of flow cytometry continues to grow and advance. It remains an indispensable tool for clinicians and researchers. The “Northwest Regional Cytometry” meeting held earlier this year focused on cytometry data: mining, modeling, and management.

    A number of presentations focused on the two big players, flow and image cytometry, which analyze multiple components of individual cells using labeled antibodies. New technologies include acoustics to align cells more accurately, robotics for higher throughput, and supercomputing for rapid desktop analysis. A newer kid on the block is chemical cytometry, which employs a suite of analytical tools to characterize a large number of cellular components.

    Traditionally, flow cytometry has not been considered a major player for high-throughput drug screening. But there is a paradigm shift afoot, according to J. Paul Robinson, Ph.D., professor of cytomics and professor of biomedical engineering, School of Veterinary Medicine, Purdue University.

    Dr. Robinson spoke about flow cytometry and high-throughput screening. “Two outdated ideas in the field of drug discovery are that high-throughput screening is all about imaging and that flow cytometry is a slow, cumbersome technology. This is no longer valid.”

    Dr. Robinson and colleagues have developed a new format in which all samples from a complex assay are collected in a single file and deconvolved in a time series for sample separation and analysis.
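    In outline, that deconvolution amounts to reading the event stream's time channel and cutting it at the pauses between wells. A minimal sketch of the idea, assuming time-stamped events and a fixed gap threshold (the threshold value and array layout are illustrative, not the published Purdue pipeline):

    ```python
    import numpy as np

    def split_events_by_time(event_times, gap_threshold):
        """Assign each event to a well index by detecting pauses in the
        acquisition time stream (one pause per well-to-well move).

        event_times   -- 1-D array of per-event time stamps, sorted ascending
        gap_threshold -- minimum inter-event gap (same units) that marks
                         the boundary between two wells
        """
        gaps = np.diff(event_times) > gap_threshold
        # cumulative count of boundaries crossed = well index per event
        return np.concatenate(([0], np.cumsum(gaps)))

    # Toy example: three "wells" worth of events separated by long pauses
    times = np.array([0.1, 0.2, 0.3, 5.0, 5.1, 5.2, 10.0, 10.1])
    print(split_events_by_time(times, gap_threshold=1.0))  # -> [0 0 0 1 1 1 2 2]
    ```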

    “The fundamental change is that high-throughput flow, in which thousands of samples can be run quickly, opens the door to a systems approach to assay design. In the past, sampling was done with single tubes. Now we can sample and analyze using 384-well plates.”

    The key breakthrough is the use of robotics. “These were developed by Larry Sklar to add cells, buffers, and dyes. Automation provides lower cost, better quality control, and faster results. Further, classification tools allow us to enhance the automation process and provide rapid, useful results.”

    Data analysis is also important. “We are working on this bottleneck and have spawned some solutions to merge high-throughput analysis and high-content screening. Solving this problem will be transformational. A key step is to improve and optimize assay design. We can now utilize a 1-microliter volume of cells and from that get many parameters to assess multiple populations of cells, run time courses, dose responses, etc. Instead of doing 25 samples in an hour, with robotics we can easily do 10,000 samples in one afternoon.”

    Dr. Robinson said scientists need to think differently. “We should be thinking systems biology and flow cytometry, since we want to simultaneously obtain information on multiple pathways and cell operations. Basically, we want instantaneous gratification of data analysis. The good news is that this is now becoming a reality.

    “One of the leaders in the systems approach is Garry Nolan, who has developed advanced barcoding tools allowing dozens of complex pathways to be studied simultaneously. Nolan’s developments have been transformational in increasing sample multiplexing and complexity.

    “Data analysis is again one of the limiting issues faced by scientists. Analyzing 15 to 50 simultaneous parameters is not something that current approaches can handle. The increase in complexity and sampling rates is defining a new application space for informatic, high-content flow cytometry.”
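    The barcoding approach Dr. Robinson credits to Nolan tags each sample with a unique combination of dye intensity levels before pooling, so that after acquisition every event can be traced back to its source sample from the barcode channels alone. A simplified sketch of the deconvolution step, assuming two barcode dyes at three known intensity levels each (the levels, channel layout, and numbers here are illustrative, not Nolan's actual protocol):

    ```python
    import numpy as np

    def debarcode(events, levels):
        """Assign pooled events back to source samples.

        events -- (n_events, n_barcode_channels) array of barcode intensities
        levels -- per-channel intensity levels; e.g., two dyes at three
                  levels each encode 3 * 3 = 9 samples in one pooled run
        """
        codes = np.zeros(len(events), dtype=int)
        for ch, lv in enumerate(levels):
            # nearest intensity level on this channel
            idx = np.argmin(np.abs(events[:, ch:ch+1] - np.asarray(lv)), axis=1)
            codes = codes * len(lv) + idx
        return codes

    # Two dyes, three illustrative intensity levels each -> 9 samples
    levels = [[0.0, 0.5, 1.0], [0.0, 0.5, 1.0]]
    pooled = np.array([[0.02, 0.98], [0.51, 0.49], [1.02, 0.01]])
    print(debarcode(pooled, levels))  # -> [2 4 6]
    ```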

    Garry Nolan, Ph.D., professor of microbiology and immunology at Stanford, not surprisingly, agreed. “Currently, lab automation is coming along for academics, but it has to be scaled up to become effective. Obviously, we’ve been able to benefit from the huge investment pharma has made over the years and the innovation that has come from supporting the pharma drug screening market. But I would say that for most purposes, it’s still being patched together with duct tape and good programming skills in labs.

    “Perhaps the main reason it has not moved into mainstream use is that no single workflow has been settled on as mutually needed by many labs. Thus, the automation still requires maximum flexibility, meaning only a few labs can afford it.

    “We think we have identified certain key needs in the staining and preparation of whole blood or cell lines that are amenable to cost-effective, broadly usable automation. We’ll see in the next few months whether that comes to pass.”

  • Desktop Supercomputing

    Beckman Coulter’s Kaluza flow cytometry software can process multiparametric fluorescent data of millions of cells in real time, not only for display in n-dimensions but also to calculate multiple statistics simultaneously.

    Supercomputing technology is now being applied to process the large amounts of data generated in flow cytometry. “The field of flow cytometry is challenged by two major problems: massive data generation and real-time analysis of the data,” reported Bob Zigon, senior staff R&D engineer for flow cytometry at Beckman Coulter.

    “In about a 13-year period, one pharmaceutical company estimated it had 60 billion parameter/events and 250 gigabytes of flow data. This is a real data-mining challenge.”
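    The two quoted figures are roughly consistent with each other, as a back-of-the-envelope check shows, assuming each parameter value is stored as a 32-bit number (the quote does not specify a storage format):

    ```python
    values = 60e9          # parameter/events accumulated over ~13 years
    bytes_per_value = 4    # assumed 32-bit storage per parameter value
    print(values * bytes_per_value / 1e9, "GB")  # 240.0 GB, near the quoted 250 GB
    ```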

    According to Zigon, the solution is a compact board that Beckman recommends as an option for its Kaluza flow cytometry software. “The NVIDIA Tesla C2050 general-purpose graphics processing unit has 3 gigabytes of RAM, 448 processors (as compared to 1–8 on a typical desktop computer), and performs one trillion arithmetic operations per second. This allows Kaluza to process up to 10 million events in real time and offers an analytical speed that is several hundred times faster than other commercially available cytometry software.”

    In flow cytometry, a user typically employs complex gating techniques to identify subsets of cells. Once these subsets are identified, the user then wants statistics such as the mean, median, and standard deviation computed on them. The Tesla-enabled version of Kaluza computes these sets and statistics in real time. Because feedback from moving the mouse occurs instantaneously, the user can quickly explore complicated what-if scenarios without interruption. The net effect is a faster, more reliable means to generate and understand data in real time.
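    The computation being accelerated is, in outline, a gate test followed by statistical reductions over the gated events. A CPU-side numpy sketch for illustration only; Kaluza runs these steps on the GPU, and the channel names, gate bounds, and simulated data below are all made up:

    ```python
    import numpy as np

    # Simulated events: columns are forward scatter, side scatter, and a marker
    rng = np.random.default_rng(0)
    events = rng.normal(loc=[500, 300, 50], scale=[80, 60, 20], size=(1_000_000, 3))
    fsc, ssc, marker = events[:, 0], events[:, 1], events[:, 2]

    # A rectangular gate on the scatter channels selects a cell subset
    in_gate = (fsc > 400) & (fsc < 700) & (ssc > 200) & (ssc < 450)
    subset = marker[in_gate]

    # Statistics recomputed on every gate change -- the step the GPU makes
    # fast enough to track mouse movement interactively
    print(f"{in_gate.sum()} events in gate")
    print(f"mean={subset.mean():.2f} median={np.median(subset):.2f} sd={subset.std():.2f}")
    ```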

