HT Expression Profiling
The bottlenecks and challenges of qPCR no longer reside in the technology itself, but rather in what occurs up- and downstream, explained Mikael Kubista, Ph.D., founder of TATAA Biocenter. “The upstream, or preanalytical phase, is a major challenge, especially for gene-expression analysis,” he said.
This is due to several factors. Most sample-collection tubes contain EDTA, which chelates magnesium and greatly affects expression; some genes become 20-fold up-regulated. Tissue samples often have degraded mRNA due to nucleases released upon collection. “I read somewhere that about 85 percent of samples being analyzed from tissue are formalin-fixed embedded tissues,” said Dr. Kubista. “This material is often quite damaged and questionable in terms of expression-profiling analysis.”
Since biological material is heterogeneous, it’s difficult to measure a particular disease signature above the variable background. This has led to HT single-cell expression profiling. HT is needed because expression occurs in bursts. “If you take a snapshot of the RNA levels in an individual cell, they vary. In order to obtain statistics, you have to measure at least 50 cells to get a signature. We know the most powerful signatures are not only gene-expression profiles, but also correlation between genes,” Dr. Kubista explained.
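Why pooling on the order of 50 cells is needed can be illustrated with a toy bursting model. This is a minimal sketch; the burst frequency and transcript counts below are assumed, illustrative values, not measurements.

```python
import random
import statistics

random.seed(1)

def burst_expression():
    """Toy model of transcriptional bursting: a gene is mostly silent
    but occasionally fires a large burst of transcripts."""
    if random.random() < 0.2:          # gene "on" in ~20% of snapshots (assumed)
        return random.gauss(100, 20)   # burst: ~100 transcripts (assumed)
    return random.gauss(5, 2)          # basal: a few transcripts (assumed)

def mean_over_cells(n):
    """Average expression across a pool of n single cells."""
    return statistics.mean(burst_expression() for _ in range(n))

# Single-cell snapshots vary wildly; averaging over ~50 cells
# gives a far more stable estimate of the expression level.
singles = [burst_expression() for _ in range(30)]
pooled = [mean_over_cells(50) for _ in range(30)]
print("single-cell spread:", round(statistics.stdev(singles), 1))
print("50-cell mean spread:", round(statistics.stdev(pooled), 1))
```

The spread of the 50-cell averages shrinks roughly with the square root of the number of cells pooled, which is why a snapshot of one cell says little while ~50 cells yield a usable signature.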
This correlation is key to the downstream process, or data analysis and mining. Although most researchers analyze one gene at a time, this is not very powerful due to high background noise. “However, if you monitor five to ten markers, you can take advantage of the correlation of gene expression. This will be important in the future not only for diagnostics, but also for prognostics and prediction of response to therapy.”
Dr. Kubista also said there will be a trend toward combining information from more measurements to include all the markers involved; this is known as multimodal analysis. “All the tools are there, but nobody is putting everything together. This has been recognized as an important field in Europe, but what is missing is a simplified work flow.”
He added that there is still too much user intervention, which introduces large errors, and that traceability is not as good as it used to be. However, he says, more researchers are preparing an experimental design first and performing replicates at all steps of the experimental procedure. “We are learning how to optimize the experiment by finding out where the variability comes from and then discovering how many subjects are required to measure what is needed, like two-fold expression.”
This is called power testing and is key, said Dr. Kubista, because many studies enroll either too few subjects (unable to detect differences) or too many (wasting money). Since this approach requires HT, Dr. Kubista says the field will move toward more outsourcing, and researchers will focus more on planning experiments and collecting samples.
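A minimal sketch of such a power calculation, using the standard normal-approximation sample-size formula for comparing two groups. The biological SD of 1 log2 unit is an assumed, illustrative value; a two-fold change is exactly 1 unit on a log2 scale.

```python
import math
from statistics import NormalDist

def subjects_per_group(effect, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size for a two-group comparison:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sd / effect)^2 per group."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return math.ceil(2 * ((z_a + z_b) * sd / effect) ** 2)

# Detecting a two-fold change (effect = 1 log2 unit) against an
# assumed biological SD of 1 log2 unit, at 5% significance, 80% power:
print(subjects_per_group(effect=1.0, sd=1.0))   # → 16 per group
```

The formula makes the trade-off Dr. Kubista describes explicit: halving the biological variability cuts the required cohort roughly fourfold, while chasing a smaller fold change inflates it just as quickly.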