Maximizing HRM Assay Performance
“HRM—high-resolution melting—is an emerging technology that’s a natural outgrowth of the PCR arena,” stated David Schuster, Ph.D., director of R&D at Quanta Biosciences. The company is developing reagents to enhance this assay for genetic-variation analysis and gene quantification. Dr. Schuster added that the key to transitioning from qPCR to HRM is the dyes.
“In theory, you want to use a dye that’s occupying every site, and as you increase the temperature and the DNA dissociates more and more, the dye no longer binds because the DNA is single stranded. So you have a direct decrease in fluorescence as the DNA is melting. The promise of HRM is being able to detect subtle differences in sequence variation. So, if you have a well-behaved system, you can readily resolve even silent base changes such as C-G SNPs (transversions); you can actually pick those up.”
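The fluorescence-decrease principle Dr. Schuster describes can be illustrated with a minimal sketch (hypothetical data, not Quanta's method): an idealized sigmoidal melt curve is generated, and the melting temperature (Tm) is located where the fluorescence drop is steepest, i.e., the peak of -dF/dT.

```python
import math

def fluorescence(temp_c, tm=80.0, steepness=1.5):
    """Idealized sigmoidal melt curve: high fluorescence below Tm, low above."""
    return 1.0 / (1.0 + math.exp((temp_c - tm) / steepness))

def find_tm(temps, signal):
    """Return the temperature where fluorescence decreases fastest (-dF/dT max)."""
    derivs = [-(signal[i + 1] - signal[i]) / (temps[i + 1] - temps[i])
              for i in range(len(signal) - 1)]
    i_max = max(range(len(derivs)), key=derivs.__getitem__)
    return temps[i_max]

temps = [70 + 0.1 * i for i in range(201)]   # 70-90 degrees C in 0.1-degree steps
signal = [fluorescence(t) for t in temps]
print(find_tm(temps, signal))                # recovers a value near the true Tm of 80.0
```

In a real HRM experiment, sequence variants shift this curve by fractions of a degree, which is why dye saturation and fine temperature resolution matter.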
The Accu-Melt HRM SuperMix reagent has been engineered to maximize the melting differences seen between sequence variants. It should work with any HRM assay, although the company is currently focused on SNP assays. Other potential applications include low-resolution assays to identify gene knockouts. Traditionally this is done via PCR with three primers; done via HRM, it would distinguish subtle sequence variations and eliminate the gel step.
The firm said the new reagent should provide considerable time savings because much less optimization is required. Dr. Schuster added that it should work for methylation analysis as well, although that has not yet been validated. “We’ll be looking to see what the challenges are with it and provide unique solutions as they arise. Since there are so many diverse applications of HRM, I don’t think there’s going to be one formulation for universal application. That diversity brings a lot of complexity.”
Although many instrument and reagent companies are starting to support HRM, Dr. Schuster said it still has some big challenges. One is the need for finer temperature control. The difference in melting temperature between the two alleles of an A-T SNP may be 0.2°C, but most instruments only resolve down to 0.5°C. Algorithms attempt to compensate in a process called temperature shifting, in which curves are normalized along the x-axis. However, he cautioned, better, standardized methods for data analysis are still needed.
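A sketch of the temperature-shifting idea mentioned above (an illustrative implementation, not any vendor's algorithm): each curve is slid along the temperature axis so that all curves cross a chosen fluorescence level at the same temperature, compensating for well-to-well or run-to-run temperature offsets before shapes are compared.

```python
import math

def crossing_temp(temps, signal, level=0.5):
    """Linearly interpolate the temperature where fluorescence falls to `level`."""
    for i in range(len(signal) - 1):
        if signal[i] >= level > signal[i + 1]:
            frac = (signal[i] - level) / (signal[i] - signal[i + 1])
            return temps[i] + frac * (temps[i + 1] - temps[i])
    raise ValueError("curve never crosses the threshold")

def temperature_shift(temps, signal, reference_temp, level=0.5):
    """Shift the temperature axis so the curve crosses `level` at reference_temp."""
    offset = reference_temp - crossing_temp(temps, signal, level)
    return [t + offset for t in temps]

# Demo: a curve that reads 0.3 degrees high is re-aligned to cross 0.5 at 80.0.
temps = [70 + 0.1 * i for i in range(201)]
signal = [1.0 / (1.0 + math.exp((t - 80.3) / 1.5)) for t in temps]
shifted = temperature_shift(temps, signal, reference_temp=80.0)
```

After shifting, only shape differences between curves remain, which is what genotype calling relies on.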
HT Expression Profiling
The bottlenecks and challenges of qPCR no longer reside in the technology, but rather what occurs up- and downstream, explained Mikael Kubista, Ph.D., founder of TATAA Biocenter. “The upstream, or preanalytical phase, is a major challenge, especially for gene-expression analysis,” he said.
This is due to several factors. Most sample-collection tubes contain EDTA, which chelates magnesium and greatly affects expression; some genes become 20-fold up-regulated. Tissue samples often have degraded mRNA due to nucleases released upon collection. “I read somewhere that about 85 percent of samples being analyzed from tissue are formalin-fixed, paraffin-embedded tissues,” said Dr. Kubista. “This material is often quite damaged and questionable in terms of expression-profiling analysis.”
Since biological material is heterogeneous, it is difficult to measure a particular disease signature above the variable background. This has led to high-throughput (HT) single-cell expression profiling. HT is needed because expression occurs in bursts. “If you take a snapshot of the RNA levels in an individual cell, they vary. In order to obtain statistics, you have to measure at least 50 cells to get a signature. We know the most powerful signatures are not only gene-expression profiles, but also correlations between genes,” Dr. Kubista explained.
This correlation is key to the downstream process of data analysis and mining. Although most researchers analyze one gene at a time, this is not very powerful due to high background noise. “However, if you monitor five to ten markers, you can take advantage of the correlation of gene expression. This will be important in the future not only for diagnostics, but also for prognostics and prediction of response to therapy.”
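The correlation signatures Dr. Kubista describes can be sketched with synthetic numbers (the marker names and values are hypothetical): Pearson correlation across single cells is computed per gene pair, so co-regulated markers stand out even when each gene's level is individually noisy.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length expression vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Expression of three hypothetical markers measured in six single cells.
gene_a = [2.0, 4.1, 6.2, 7.9, 10.1, 12.0]
gene_b = [1.1, 2.0, 3.2, 3.9, 5.1, 6.0]   # tracks gene_a: strongly correlated
gene_c = [5.0, 1.2, 6.8, 2.1, 7.3, 0.9]   # unrelated noise: weak correlation

print(pearson(gene_a, gene_b))  # close to 1
print(pearson(gene_a, gene_c))  # close to 0
```

A panel of five to ten markers would yield a matrix of such pairwise correlations, which is the multi-gene signature referred to above.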
Dr. Kubista also said there will be a trend toward combining information from more measurements to include all the markers involved; this is known as multimodal analysis. “All the tools are there, but nobody is putting everything together. This has been recognized as an important field in Europe, but what is missing is a simplified work flow.”
He added that there is still too much user intervention, which introduces large errors, and that traceability is not as good as it used to be. However, he said more people are designing their experiments first and performing replicates at all steps of the procedure. “We are learning how to optimize the experiment by finding out where the variability comes from and then discovering how many subjects are required to measure what is needed, like two-fold expression.”
This is called power testing and is key, said Dr. Kubista, because many studies use either too few subjects (unable to detect differences) or too many (wasting money). Since this approach requires high throughput, Dr. Kubista said the field will move toward more outsourcing, with researchers focusing on planning experiments and collecting samples.
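The power-testing idea can be made concrete with the standard two-sample sample-size formula (a textbook approximation, not Dr. Kubista's specific method; the sigma value is an assumed standard deviation): how many subjects per group are needed to detect a two-fold change at 5% significance with 80% power?

```python
import math

def subjects_per_group(delta, sigma, z_alpha=1.96, z_beta=0.84):
    """n = 2 * ((z_alpha + z_beta) * sigma / delta)^2, rounded up.

    z_alpha: two-sided 5% significance; z_beta: 80% power.
    """
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# A two-fold expression change is a difference of 1.0 on the log2 scale.
# Assuming a standard deviation of 1.0 log2 unit (biological + technical noise):
print(subjects_per_group(delta=1.0, sigma=1.0))  # 16 subjects per group
```

Halving the noise (sigma = 0.5) drops the requirement fourfold, which is why controlling preanalytical variability pays off directly in smaller studies.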