Accurate determination of gene expression in qPCR depends on the quantification cycle (Cq), also known as the threshold cycle: the cycle at which the fluorescence signal rises above the overall background, measured in the linear portion of the amplification curve. The Cq value is therefore the basis for quantification in qPCR. However, this analysis often has fundamental flaws, according to Chaminda Salgado, head of PCR services at NDA Analytics.
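The threshold-crossing idea can be sketched in a few lines of code. This is a minimal illustration, not any vendor's algorithm: the sigmoidal curve parameters, the user-chosen threshold, and the function names are all hypothetical, and real instruments apply their own baseline correction and curve processing first.

```python
import math

def simulate_amplification(cycles=40, baseline=0.05, plateau=1.0,
                           midpoint=24.0, slope=0.45):
    # Hypothetical amplification curve: a flat background plus
    # logistic (sigmoidal) growth in product fluorescence.
    return [baseline + plateau / (1 + math.exp(-slope * (c - midpoint)))
            for c in range(1, cycles + 1)]

def cq(fluorescence, threshold):
    # Cq as commonly defined: the (fractional) cycle at which the
    # signal first crosses the threshold, refined by linear
    # interpolation between the two flanking cycles.
    for i, f in enumerate(fluorescence):  # index i is cycle i + 1
        if f >= threshold:
            if i == 0:
                return 1.0
            prev = fluorescence[i - 1]
            frac = (threshold - prev) / (f - prev)
            return i + frac  # previous cycle number (i) plus fraction
    return None  # signal never crosses the threshold: Cq undetermined
```

Note that the result hinges entirely on where the threshold is placed, which is exactly the subjectivity Salgado describes below: move the threshold and the Cq, and hence the quantification, moves with it.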
“One has to ask whether your Cq is really telling you the truth. Although Cq should be the cycle at which one can discriminate signal above noise, it is often the user who defines this threshold. There are also issues surrounding the alternative algorithms designed to be nonsubjective. One of the problems is that the Cq is only a single data point and does not use all the data available from a real-time assay. This is a fundamental flaw.
“Another issue is that the baseline, often set universally across all reactions, is not appropriate for every reaction. Together, these two issues can produce unreliable data that no amount of statistical evaluation or MIQE compliance can correct. I estimate that, on most instruments, around 10 percent of reactions are prone to such issues. The current Cq and baselining methods have not significantly evolved in over 14 years, since the first commercial real-time instruments were launched. This is because vendors have focused instead on the requirements of the applications for which qPCR is used.”
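The baseline problem Salgado raises can be made concrete with a short sketch. The approach below, estimating each well's own baseline from its early "ground phase" cycles and subtracting it, is one common alternative to a universal baseline; the cycle window and function names here are illustrative assumptions, not a specific instrument's method.

```python
def per_reaction_baseline(fluorescence, start=3, end=15):
    # Estimate this well's baseline as the mean raw signal over the
    # early ground-phase cycles. The 3-15 window is an illustrative
    # choice; real instruments select it per run or per well.
    window = fluorescence[start - 1:end]  # cycle numbers are 1-indexed
    return sum(window) / len(window)

def baseline_corrected(fluorescence, start=3, end=15):
    # Subtract the reaction's own baseline rather than a value set
    # universally across the plate.
    b = per_reaction_baseline(fluorescence, start, end)
    return [f - b for f in fluorescence]
```

A universal baseline taken from one well mis-corrects any well whose ground-phase signal sits higher or lower, shifting the cycle at which its corrected curve crosses the threshold and therefore its apparent Cq.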
Salgado strongly cautions all qPCR users to be aware of this issue. “The basic data point (Cq) commonly used for subsequent qPCR analysis is not appropriate for all reactions. Alternatives need to be investigated before we continue to push qPCR into regulatory and diagnostic environments, and especially as we venture into higher throughput.”
Although a number of growing pains still plague qPCR, both in the technology itself and in the analysis of the data it generates, new approaches continue to emerge to enhance this popular tool.