Evolution of Polymerase Chain Reaction


October 1, 2011 (Vol. 31, No. 17)

Carl T. Wittwer, M.D., Ph.D.

Seminal Technology Continues to Be a Work in Progress

Since the discovery of the polymerase chain reaction (PCR) by the oligonucleotide chemist Kary Mullis in 1983, the method has revolutionized molecular biology and clinical diagnostics.

Before PCR, DNA amplification required multiple steps, including cloning into plasmids, insertion into bacteria, bacterial growth, isolation of plasmid DNA, and separation of inserts from plasmid vectors.

In contrast, PCR is performed in vitro as a single step, requiring only two oligonucleotide primers, a polymerase, and temperature cycling of the DNA template in the presence of deoxyribonucleotides.

Although its spread was initially limited by restrictive patent policies, the basic method is now off patent and has become a democratic cornerstone of molecular biology. Thousands of scientists have contributed to and expanded the methods and applications of PCR, including quantification of transcripts after reverse transcription and PCR followed by cycle sequencing.

Anyone can perform PCR with generic reagents and simple laboratory instruments, amplifying specific DNA segments by 10⁶- to 10⁹-fold for further study in genetics, oncology, and infectious disease.
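The 10⁶- to 10⁹-fold figure follows directly from near-doubling each cycle. A minimal sketch (illustrative only; the function name and the 90% efficiency value are my assumptions, not from the article):

```python
def copies_after(cycles, start_copies=1, efficiency=1.0):
    """Expected copy number after PCR, assuming a constant
    per-cycle efficiency (1.0 = perfect doubling each cycle)."""
    return start_copies * (1 + efficiency) ** cycles

# 30 cycles of perfect doubling: 2**30, a little over 10**9-fold
print(copies_after(30))
# At a more realistic 90% per-cycle efficiency, 30 cycles
# still yields roughly 10**8-fold amplification
print(copies_after(30, efficiency=0.9))
```

In practice efficiency falls off in later cycles as reagents deplete, which is why real amplification plateaus rather than growing geometrically forever.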


Room for Improvement

Initially, PCR was slow, expensive, and required further analysis by gel electrophoresis. Even after automated thermal cyclers (PerkinElmer, Life Technologies) and thermally stable polymerases (Roche) were developed in the late 1980s, PCR protocols typically required four hours for amplification.

The feasibility of rapid cycle PCR (30 cycles in 15–30 min) was shown in the early 1990s and commercialized as the RapidCycler (Idaho Technology) with eventual adoption into the real-time capillary LightCycler (Roche) and SmartCycler (Cepheid) platforms. Amplification speed was limited by sample geometry and heat transfer, not reaction chemistry.

Nevertheless, microwell plates became the dominant format, reflecting user preference for convenience and automation over speed and temperature control. A trend toward faster amplification was recently rebranded as Fast PCR, often packaged with presumed enabling reagent modifications.

However, amplification speed remains instrument limited, not reagent or reaction limited.

Like the order-of-magnitude gain in speed, the cost of PCR has fallen by at least an order of magnitude. After the original Taq polymerase patents were deemed unenforceable, reagent manufacturers turned their focus to improved enzymes and hot-start methods to add value.

Physical separation of critical components with wax, inactivation of the polymerase with an antibody, and heat-labile enzymes, primers, and dideoxynucleotides have all been commercialized to improve specificity. Although not necessary for the majority of applications, such improvements may compensate for poor reaction design and execution.

Twenty years ago, reaction volumes were typically 100 µL. Today, 10 µL is common, and microfluidic systems run reactions a thousand times smaller still, at 1–10 nL each.

Digital PCR can be performed to take advantage of limited template copies per reaction, or preamplification can overcome the sensitivity limits inherent to small volumes. Even picoliter reactions are now routinely formed as emulsions or controlled droplets within oil, enabling massively parallel sequencing by expansion of individual templates or production of target libraries.

To the extent that reagent cost is proportional to reagent volume, the expense of individual reactions should be vanishingly small. However, in the case of digital PCR, many reactions are necessary for quantification, and droplets can only be analyzed individually with expensive equipment. Furthermore, if each reaction requires specific oligonucleotides, the initial synthesis costs remain the same and can be oppressive when many targets are analyzed.
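The reason digital PCR needs many partitions for quantification is statistical: templates distribute randomly, so some positive partitions hold more than one copy, and the count must be corrected with Poisson statistics. A hedged sketch (the function name, droplet counts, and 1 nL volume are my illustrative assumptions):

```python
import math

def digital_pcr_copies(positive, total, partition_volume_nl):
    """Estimate template concentration (copies/µL) from a digital
    PCR run. With random partitioning, the negative fraction is
    exp(-lambda), so the mean copies per partition is
    lambda = -ln(1 - p_positive)."""
    p = positive / total
    lam = -math.log(1.0 - p)                   # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)  # convert nL to µL

# e.g. 5,000 positive droplets out of 20,000 one-nanoliter droplets:
# naive counting would give 250 copies/µL; the Poisson correction
# gives slightly more, since some droplets held multiple copies
print(digital_pcr_copies(5000, 20000, 1.0))
```

Precision improves with the number of partitions counted, which is exactly why many reactions, and the equipment to read them individually, are required.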

Vials containing DNA that was produced using the polymerase chain reaction technique. Real-time PCR, in particular, has become popular because of its ability to integrate separate amplification and analysis steps without large-scale automation. [Peter Menzel/Photo Researchers]

Sample Runs

As reaction volumes have decreased, the number of samples run at one time has generally increased. 96-well microtiter trays are standard, and 384-well and 1,536-well instruments are available. A microfluidic matrix of 96 assays and 96 samples amplifies over 9,200 individual reactions on one chip (Fluidigm). Bridge amplification tethers PCR primers to a surface and can amplify millions of PCR clones prior to massively parallel sequencing. Of course, more reactions are not necessarily better, and value does not always scale linearly with the number of samples.

More important than speed, cost, and batch size is the information (both qualitative and quantitative) that PCR enables. Historically, gel analysis was performed after PCR for size analysis of the products. Such “open” analysis provides flexibility but also the opportunity (or some would say, guarantee) that PCR products eventually contaminate the PCR reagents.

In contrast, real-time PCR is homogeneous and closed without any separation or processing steps. Quantitative information on the amount of initial template is obtained during PCR by monitoring fluorescence each cycle, correlating the log of the initial template concentration to the quantification cycle (Cq).

The term Cq replaces Ct (cycle threshold) and Cp (crossing point) as introduced by MIQE (Minimum Information for Publication of Quantitative Real-Time PCR Experiments), an attempt to standardize and provide guidelines for performance and interpretation. Quantitative reverse transcriptase PCR remains the gold standard for transcript quantification and is often used to validate results obtained by other methods.
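The log-linear relation between initial template and Cq is what makes absolute quantification from a dilution series possible; the slope of the fitted standard curve also yields the amplification efficiency (E = 10^(−1/slope) − 1, where a slope of about −3.32 corresponds to perfect doubling). A minimal sketch with made-up Cq values (the function name and data are my assumptions):

```python
def fit_standard_curve(log10_conc, cq):
    """Least-squares fit of Cq versus log10(initial template copies).
    Returns (slope, intercept, efficiency). A slope near -3.32
    (-1/log10(2)) indicates ~100% amplification efficiency."""
    n = len(cq)
    mx = sum(log10_conc) / n
    my = sum(cq) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Hypothetical 10-fold dilution series: each decade of template
# shifts Cq by ~3.3 cycles when doubling is near-perfect
dilutions = [1, 2, 3, 4, 5]                 # log10 starting copies
cqs = [33.2, 29.9, 26.5, 23.2, 19.9]        # illustrative Cq values
slope, intercept, eff = fit_standard_curve(dilutions, cqs)
```

Unknown samples are then quantified by reading their Cq back through the fitted line, which is the standard-curve method the MIQE guidelines ask authors to document.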

Fluorescent indicators that correlate with the amount of PCR product are necessary for real-time PCR. Indicators can either be dyes that change fluorescence in the presence of double-stranded DNA or oligonucleotide probes that change fluorescence by hybridization and/or hydrolysis.

Functional classification of real-time probes as hybridization or hydrolysis probes is more useful than trademarks. Hybridization probes reversibly change fluorescence with hybridization and typically consist of one or two probes per target.

Single hybridization probes change fluorescence after a conformational change of the probe or hybridization to the target. Dual hybridization probes change fluorescence by approximation of two fluorophores and resonance energy transfer. Hydrolysis probes generate fluorescence irreversibly by polymerase cleavage of the probes between two fluorophores.

Most fluorescent indicators are not linearly correlated with the amount of product, but this does not compromise their function in quantitative PCR. The cumulative signal from hydrolysis probes may provide a sensitivity advantage, but the ability for reversible melting analysis (characteristic of hybridization probes) is lost.

Probe-melting analysis is popular for clinical genotyping, partly because multiple variants can be identified under one probe, and was used in the first FDA-approved genetic tests for F5 and F2 genotyping in 2002.

Commonly Used Dyes

Although probes have the advantage of added specificity, dyes are more commonly used because of convenience and cost. Furthermore, PCR product melting analysis largely equalizes the specificity difference between dyes and probes.

Melting analysis with SYBR Green I, first introduced on the LightCycler in 1997, is now widely used on all instruments, although it is sometimes rebranded as dissociation analysis. The advantage of melting as an analytical technique is that, just like real-time PCR, it requires only fluorescence detection and temperature control and can be performed in the same instruments.

More recently, melting analysis has become even more powerful as its resolution has increased. High-resolution DNA melting analysis (also known as HRM) is now accepted as the best variant scanning method and the simplest method for genotyping.

Genotyping is performed by bracketing the variation with a small PCR product that has a melting curve shape and/or position that depends on the genotype. Variant scanning is performed with dyes that detect heterozygotes (saturation dyes) as a change in melting curve shape.

When first introduced, melting analysis in conjunction with PCR was a simple check for specificity summarized by its melting temperature, or Tm. A pure PCR product was expected to have a single transition and a characteristic Tm.

This simple assumption turned out to be wrong for many PCR products; multiple domains are often detected with high-resolution melting and can be accurately predicted. This increased precision of DNA melting provides an analytical tool that is replacing gel electrophoresis for many applications.
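Melting data are conventionally displayed as the negative derivative −dF/dT, so each melting domain appears as a peak whose apex approximates its Tm. A minimal sketch on synthetic data (the function name and the simulated single-domain curve centered at 85 °C are my assumptions, not real instrument output):

```python
import math

def negative_derivative(temps, fluor):
    """Central-difference -dF/dT of a melting curve; melting
    transitions appear as peaks whose apex approximates Tm."""
    out = []
    for i in range(1, len(temps) - 1):
        dF = fluor[i + 1] - fluor[i - 1]
        dT = temps[i + 1] - temps[i - 1]
        out.append((temps[i], -dF / dT))
    return out

# Synthetic melting curve: one sigmoid transition centered at 85 °C,
# sampled at 0.1 °C resolution from 70 to 100 °C
temps = [70 + 0.1 * i for i in range(301)]
fluor = [1.0 / (1.0 + math.exp((t - 85.0) / 0.8)) for t in temps]
peak_t, peak_h = max(negative_derivative(temps, fluor), key=lambda p: p[1])
print(round(peak_t, 1))   # apex falls near the simulated Tm of 85 °C
```

With high-resolution instruments, a product with multiple melting domains shows several such peaks, which is precisely the detail that the single-Tm assumption missed.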

In addition to PCR product melting, probe melting analysis is possible without fluorescently labeled probes. Using the same dyes that detect PCR products, the melting transitions of unlabeled probes or internal secondary structure can be monitored. Unlabeled probes require a 3´-end block to prevent extension, but otherwise avoid any covalent modifications.

If unlabeled probes are attached as 5´-tails to primers, they do not require blocking at the 3´-end. After PCR, these "snapback primers" form intramolecular hairpins, with the melting of the stem sensitive to the internal sequence. When unlabeled probes or snapback primers are included in PCR, both product scanning and specific genotyping can be performed.

Integrative Capabilities

Part of the appeal of real-time PCR and melting analysis is the integration of previously separate amplification and analysis steps without large-scale automation. Further integration of sample preparation into small diagnostic devices has now achieved FDA approval on at least two platforms: the GeneXpert (Cepheid) and the FilmArray (Idaho Technology).

The FilmArray, for example, achieves sample disruption (bead beating), nucleic acid purification, reverse transcription, multiplex PCR for preamplification of 15–50 targets, secondary individual PCRs on a mini-array, and high-resolution melting analysis, all in less than one hour.

In summary, we have come a long way in making PCR faster, cheaper, and better. This article is necessarily biased by my own experiences, and my apologies if I have left out your favorite PCR derivative or direction. The fact that there are so many supports the power, flexibility, and value of the fundamental method. PCR will remain one of the basic tools of the genetic engineer as an elegant and robust method for targeted amplification.

What of the future? The speed of PCR and melting is still limited by temperature measurement and control in commercial instruments. Temperature protocols are not transferable between manufacturers, and most users think in terms of an equilibrium temperature/time paradigm rather than the kinetic reality of constantly changing temperature necessary for rapid PCR.

Users and manufacturers have, in the past, been more concerned about numbers of samples (batch size) than speed and quality of the reactions. This historical choice of quantity over quality provides an opportunity going forward to improve temperature control, and thereby, speed and reproducibility between samples and instruments.

The potential of high-quality PCR and melting in less than 10 minutes is real. Other areas open to further progress are design and prediction tools focused on PCR. Although many primer and probe design tools are available, few if any of the commonly accepted rules are based on empirical evidence.

Prediction tools for Tm are often inaccurate and complicated by the unfortunate trend of manufacturers to include proprietary ingredients that are not disclosed and users’ willingness to purchase them. Perhaps it is another tribute to PCR that it works so often given the current limitations of the state of the art.
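To make the inaccuracy concrete, even the classic GC-content/salt formula, Tm = 81.5 + 16.6·log10([Na+]) + 0.41·(%GC) − 675/length, ignores sequence order entirely; nearest-neighbor thermodynamic models do better, but neither can account for buffer components that are never disclosed. A crude baseline sketch (the function name and example sequence are my assumptions):

```python
import math

def tm_basic(seq, na_molar=0.05):
    """Crude oligo Tm estimate (°C) from GC content and monovalent
    salt, using the classic formula:
        Tm = 81.5 + 16.6*log10([Na+]) + 0.41*(%GC) - 675/length
    Nearest-neighbor models are more accurate, but any prediction
    degrades when buffer ingredients are proprietary and undisclosed.
    """
    seq = seq.upper()
    gc = 100.0 * sum(seq.count(b) for b in "GC") / len(seq)
    return 81.5 + 16.6 * math.log10(na_molar) + 0.41 * gc - 675.0 / len(seq)

# 20-mer at 50% GC in 50 mM monovalent salt
print(round(tm_basic("ACGTACGTACGTACGTACGT"), 1))
```

Two 20-mers with identical base composition but different order get the same answer from this formula, which is one simple illustration of why empirical validation of predicted Tm values still matters.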

Carl T. Wittwer, M.D., Ph.D. (carl.wittwer@path.utah.edu), is a professor in the department of pathology at the University of Utah and an adjunct professor in biomedical engineering at the same institution.
