Quantitative real-time polymerase chain reaction (qPCR) continues to gain ground as the premier technology for gene-expression analysis. While new strides are being made to refine and improve the technique, problems remain, especially the need to improve consistency and build confidence in data analysis. Select Biosciences’ “qPCR Europe” conference, held last month in Dublin, addressed current issues and highlighted emerging tools and strategies for this continuously evolving field.
Mark Behlke, M.D., Ph.D., CSO of Integrated DNA Technologies, reported on the company’s new technology to enhance the performance of qPCR. “A central strategy common to many qPCR methods is the use of complementary hybridization probes bearing a 5´-dye and a 3´-quencher.
“Following hybridization to the target, the probe is degraded, and the amount of fluorescence generated is proportional to how much target is present. Sensitivity is improved by having a high signal-to-noise ratio. Good quencher performance helps lower background, improving sensitivity and assay performance. But often, this can be a challenge.”
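To put the signal-to-noise point in concrete terms, the sketch below models how residual background fluorescence from incompletely quenched probe delays the cycle at which amplified signal becomes detectable. It is an illustrative toy model, not IDT’s chemistry or software, and every parameter value is an arbitrary assumption.

```python
# Toy model (not IDT's method): how background fluorescence from
# incomplete quenching limits qPCR sensitivity. All values are
# arbitrary assumptions for illustration.
import math

def detection_cycle(n0, efficiency=1.9, signal_per_copy=1e-9,
                    background=0.05, threshold_factor=2.0):
    """Cycle at which amplicon signal exceeds threshold_factor * background.

    n0              -- starting target copies
    efficiency      -- per-cycle amplification factor (2.0 = perfect doubling)
    signal_per_copy -- fluorescence per degraded probe (arbitrary units)
    background      -- residual fluorescence from intact, unquenched probe
    """
    threshold = threshold_factor * background
    # Solve n0 * signal_per_copy * efficiency**c = threshold for cycle c
    return math.log(threshold / (n0 * signal_per_copy), efficiency)

for bg in (0.05, 0.005):  # tenfold better quenching, tenfold lower background
    print(f"background={bg}: detectable at cycle ~{detection_cycle(100, background=bg):.1f}")
```

In this model, cutting background tenfold pulls the detection threshold down with it, so the same 100-copy input crosses it several cycles earlier, which is exactly what a lower-noise probe buys.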
According to Dr. Behlke, many strategies have emerged over the years to enhance qPCR sensitivity. “Quenchers are normally placed at the 3´ end of the probe, which is usually 25–30 bases away from the fluorescent dye. One strategy to improve the quality of quenching is the use of molecular beacons, which have hairpin handles that fold the probe, forcing the dye and quencher closer together.
“Another strategy is to place the quencher internally within the probe; however, this approach usually destabilizes hybridization and can hurt probe performance instead of improving it. The use of Tm-enhancing chemical modifications allows probes to be shorter, thereby improving quenching, but these methods always increase cost.”
IDT recently developed a new double-quenched probe design that places a dark quencher (ZEN™) internally within the probe, 9–10 bases from the 5´ fluorophore. The design also retains the standard 3´ quencher, which, in addition to quenching, blocks the 3´ end and prevents the probe from acting as a primer.
“The new probe design is particularly effective in enhancing sensitivity and reducing background,” Dr. Behlke explained. “Importantly, the ZEN quencher actually stabilizes probe hybridization to target, increasing Tm. This is quite novel, as all other quenchers we tested were destabilizing when inserted in the middle of a probe. The other piece of good news is that the ZEN modification is a robust chemistry and is surprisingly straightforward to manufacture, allowing us to make our dual-quenched ZEN probes with no significant cost increase.”
The company is utilizing the new technology in its PrimeTime® qPCR assays, in which the primers and probes are provided in a single-tube format.
Proximity Ligation
Life Technologies has created a new technology to systematically quantify proteins within a small sample by coupling antibody-mediated protein binding with qPCR quantification. According to Mark Shannon, Ph.D., senior staff scientist, “TaqMan® Protein Assays are an extension of proximity ligation assays (PLA™).
“In our system,” he went on to say, “the assay probes are target-specific antibodies that are conjugated to two different oligonucleotides through a biotin-streptavidin linkage. When the antibodies bind their target, the oligos are brought into proximity. Addition of a connector oligonucleotide and DNA ligase creates an amplifiable DNA template, which is amplified in a qPCR reaction. The qPCR results correlate with the amount of protein in the sample.”
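The qPCR readout of such an assay is a Cq value, which can be converted into a relative protein amount with standard qPCR arithmetic. A minimal sketch follows, using generic ΔCq fold-change math rather than Life Technologies’ own analysis software; the sample names and Cq values are hypothetical.

```python
# Generic delta-Cq relative quantification of a PLA/qPCR readout.
# Not Life Technologies' software; the Cq values below are made up.

def relative_amount(cq_sample, cq_reference, efficiency=2.0):
    """Fold difference in ligated template (and hence bound protein)
    between sample and reference, assuming equal PCR efficiency."""
    return efficiency ** (cq_reference - cq_sample)

cq_tumor, cq_normal = 24.1, 27.4   # hypothetical Cq values
print(f"~{relative_amount(cq_tumor, cq_normal):.1f}x more target protein in tumor")
```

Because each earlier cycle represents roughly a doubling of template, a sample that crosses threshold 3.3 cycles sooner contains on the order of ten times more ligated product, and by extension more bound protein.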
According to Dr. Shannon, this method for rapid quantification of proteins can be applied to formalin-fixed, paraffin-embedded (FFPE) and frozen samples. “The assays performed equivalently on both types of samples. This is important because there are vast archives of uncharacterized FFPE human tumor specimens.
“Often these are analyzed using immunohistochemistry, but that approach is much more labor-intensive and much less quantitative. Thus, studies can now be conducted with greater ease and throughput on actual tumors. This will allow a better understanding of the protein profiles of cancers and thus potentially identify new therapeutic biomarkers.”
Another benefit of the new technology is its ability to study interdependencies of proteins. “For example, we can use a model system to study the interplay of proteins in a disease pathway. As we knock down certain transcripts, we can monitor effects not only on the corresponding proteins but also on proteins in downstream pathways. Our technology enables a more systematic analysis of multiple proteins within a small sample that is highly sensitive and quantitative.”
The company initially provided six premade assay kits but recently began marketing an open-kit format that allows investigators to conjugate their own antibodies of interest.
Analyzing Genetic Variation
When double-stranded DNA (dsDNA) is heated, it separates into its single-stranded counterparts. This property is integral to any PCR reaction, but dissociation of dsDNA is also exploited in high-resolution melting (HRM) analysis. Here, an amplicon spanning the region of interest is first generated by PCR in the presence of a dye that fluoresces only when bound to dsDNA. As the temperature is increased, the fluorescence decreases, creating a melting curve in real time that can be compared to that of a standard sequence to identify subtle genetic variations.
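As a worked illustration of the principle, the following sketch builds a synthetic melting curve and recovers the melting temperature (Tm) as the peak of the negative derivative, -dF/dT, the conventional way melt peaks are visualized. The two-state sigmoid model and all numbers are assumptions for illustration, not any vendor’s HRM algorithm.

```python
# Melt-curve analysis sketch: Tm is the peak of -dF/dT.
# Synthetic two-state melt model; all values are illustrative assumptions.
import numpy as np

temps = np.arange(70.0, 95.0, 0.1)   # deg C, fine temperature steps

def synthetic_melt(tm, width=0.8):
    """Sigmoidal loss of dsDNA-bound dye fluorescence around Tm."""
    return 1.0 / (1.0 + np.exp((temps - tm) / width))

fluor = synthetic_melt(tm=84.3)      # hypothetical amplicon
dF_dT = -np.gradient(fluor, temps)   # derivative melt curve
print(f"Estimated Tm: {temps[np.argmax(dF_dT)]:.1f} C")
```

Real HRM software goes further, normalizing curves and overlaying difference plots against reference samples, because a single-base variant shifts the curve shape only subtly.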
“The high-precision melting of PCR products depends on several parameters, such as base composition and distribution, as well as overall amplicon length,” explained Andreas Missel, Ph.D., associate director of R&D at Qiagen. “The melting curve delivers a characteristic profile, allowing resolution of fine differences for applications such as SNP genotyping, mutation screening, and analysis of DNA-methylation patterns.”
There are a number of factors critical to the success of HRM. “Reliable HRM analysis requires a suitable HRM instrument, the proper chemistry, and specific data-analysis software,” Dr. Missel noted. “There is also a stringent requirement for temperature uniformity in instruments. Often there can be subtle temperature differences within the qPCR thermocycler that significantly affect results.
“Another problem is that DNA prepared with ‘home-brew’ isolation methods can contain contaminants, such as salts, phenol, or ethanol, that alter melting curves. Additionally, data interpretation can be a challenge. There is still some room for improvement in this area. Often the HRM analysis software that comes with a PCR instrument requires some arbitrary user intervention.”
To streamline the process, Qiagen offers tools such as the Type-it HRM PCR and EpiTect HRM PCR kits that do not require optimization, and a thermocycler with accompanying software. “In our thermocycler, the Rotor-Gene Q, each tube spins in a chamber of moving air, keeping all samples at precisely the same temperature during rapid thermal cycling. This provides the high consistency needed for applications such as HRM.”
The future of HRM analysis will likely be “quite interesting,” predicted Dr. Missel. “Four years ago there were about 50 publications in the literature on HRM. In 2009, there were about 500. Use is thus growing exponentially, and presumably many more types of applications for HRM are on the horizon.”
Single-Cell Challenges
Utilizing qPCR for single-cell analysis remains a significant hurdle, according to Martina Reiter, Ph.D., CEO of BioEPS. “Although single-cell analysis is not new, we still don’t have a final solution as to how to handle it. The biggest problem is technical variance during the experimental steps, especially in sampling and in the subsequent data analysis.”
“Often samples are generated by laser capture microdissection and flow cytometry sorting. Sample amounts are very small and variances are huge. When you don’t pay attention to such variances at the beginning, downstream processes are even more inaccurate.”
What about solutions? “We are at the beginning of working on this and are continuing to determine the right way to do it,” Dr. Reiter said. “Substantial improvements in the analysis steps are needed. You cannot use standard analysis methods; they need to be modified for single-cell qPCR. But it is still difficult to modify them so that technical variances are reduced as much as possible. Despite this, qPCR technology is sensitive enough to detect specific genes out of a single cell.”
The MIQE (Minimum Information for Publication of Quantitative Real-Time PCR Experiments) guidelines are a newly proposed set of recommendations for standardizing and documenting qPCR. “MIQE does not yet cover single-cell analysis specifically, and qPCR advances need to be made in documenting and determining RNA integrity, in how to handle the necessity of preamplification, in the ability to sample cells more accurately, and in downstream analysis methodologies.”
Tackling these issues now will be especially important for the future of therapeutics. “Ultimately, these problems will be solved,” Dr. Reiter predicted. “For the present, we need to realize that while qPCR for single-cell analysis is technically possible, the larger issues of experimental design and variation remain and must be addressed.”
Quantification Issues
Accurate determination of gene expression in qPCR rests on the quantification cycle (Cq), also called the threshold cycle, at which the fluorescence signal first exceeds the overall background. This is measured in the exponential (log-linear) phase of the amplification curve. Thus, the Cq value currently carries the entire burden of quantification in qPCR. However, there are often fundamental flaws in this analysis, according to Chaminda Salgado, head of PCR services at NDA Analytics.
“One has to ask whether the Cq is really telling the truth. Although Cq should be the cycle at which one can discriminate signal above noise, often the user defines this threshold. There are also issues surrounding alternative algorithms designed to be nonsubjective. One of the problems is that the Cq is only a single data point and does not utilize all the available data from a real-time assay. That is a fundamental flaw.
“Another issue is that the baseline, often set universally across all reactions, is not appropriate for every reaction. These two issues combined can produce unreliable data that no amount of statistical evaluation or MIQE compliance can correct. I estimate that, on most instruments, around 10 percent of reactions are prone to such issues. The current Cq and baselining methods have not significantly evolved in over 14 years, since the first commercial real-time instruments were launched, because vendors have focused on the applications for which qPCR is used rather than on the underlying analysis.”
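To make the criticism tangible, the sketch below implements the conventional recipe Salgado describes: average a user-chosen baseline window, subtract it, and interpolate the cycle at which the corrected signal crosses a fixed threshold. The data are synthetic, and the routine mirrors common instrument behavior rather than any specific vendor’s code; note that merely widening the baseline window shifts the reported Cq.

```python
# Conventional Cq calling on synthetic data, to show its sensitivity to
# user-chosen baseline and threshold settings. Not any vendor's code.
import numpy as np

cycles = np.arange(1, 41)
# Synthetic amplification curve: drifting baseline plus logistic amplification
raw = 0.02 * cycles + 10.0 / (1.0 + np.exp(-(cycles - 25) / 1.5))

def call_cq(raw, baseline_cycles=(3, 15), threshold=0.5):
    """Subtract the mean of a baseline window, then report the interpolated
    cycle at which the corrected signal first exceeds the threshold."""
    lo, hi = baseline_cycles
    baseline = raw[lo - 1:hi].mean()          # cycles lo..hi, inclusive
    signal = raw - baseline
    above = np.flatnonzero(signal > threshold)
    if above.size == 0:
        return None                           # threshold never crossed
    i = above[0]
    # Linear interpolation between the two cycles flanking the crossing
    return cycles[i - 1] + (threshold - signal[i - 1]) / (signal[i] - signal[i - 1])

print(call_cq(raw, baseline_cycles=(3, 15)))  # one baseline window...
print(call_cq(raw, baseline_cycles=(3, 20)))  # ...a wider one shifts the Cq
```

The same trace yields two different Cq values depending only on the baseline window, which is precisely the kind of silent, per-reaction error that a single-point Cq cannot expose.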
Salgado strongly cautions all qPCR users to be aware of this current issue. “The basic data (Cq) that is commonly used for subsequent qPCR analysis is not appropriate for all reactions. Alternatives need to be investigated before we continue to push qPCR out into regulatory and diagnostic environments, and especially as we venture into higher throughput.”
Although a number of growing pains still plague qPCR, in both the technology itself and the analysis of the data it generates, new approaches continue to emerge to enhance and improve this popular tool.