July 1, 2009 (Vol. 29, No. 13)
Strategies and Solutions for Overcoming Common Problems in Laboratory Experiments
The enhancement of PCR with real-time quantitative methods takes PCR from a workhorse preparatory method for molecular biology to an extremely powerful diagnostic and analytical tool for clinical and research applications. Studies of gene expression are among the most common applications of qPCR, and the technique is increasingly used for clinical diagnostic testing. It is sensitive enough to detect the DNA or RNA of an infectious agent, or even the early stages of cancer.
Practical applications of real-time PCR, however, are plagued with problems and complications that slow down data collection or decrease the quality of results. These problems include contamination, amplification bias, poor efficiency and sensitivity, and unsatisfactory throughput or cost. CHI’s “Quantitative PCR” meeting held recently in San Diego brought the scientific community together to share strategies and solutions for making the best use of qPCR.
Back to School
Although technological improvements can enhance a qPCR experiment greatly, some of the most effective fixes are straightforward and easy to apply with any system. Marwan Alsarraj, product manager for Bio-Rad Laboratories’ gene expression division, taught a three-hour workshop at the meeting on optimizing qPCR assays. The course covered assay optimization and troubleshooting of common problems. “By spending more time designing and qualifying a good primer set, the researcher saves time and prevents many pitfalls down the road,” Alsarraj said.
According to Alsarraj, running a standard curve on a serial dilution of template DNA with SYBR Green or EvaGreen fluorescent dye provides valuable insight into the robustness of an assay, yielding important information such as its PCR efficiency, dynamic range, reproducibility, sensitivity, and specificity. This experiment can be used to confirm that a PCR assay is properly optimized.
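The standard-curve arithmetic itself is simple enough to sketch. The following Python snippet (all values hypothetical, not from the talk) fits Cq against the log10 of template input and converts the slope into a PCR efficiency; a slope near -3.32 cycles per 10-fold dilution corresponds to roughly 100% efficiency:

```python
def pcr_efficiency(log10_inputs, cq_values):
    """Estimate PCR efficiency from a standard curve.

    Fits Cq vs. log10(template amount) by least squares, then
    converts the slope: efficiency = 10^(-1/slope) - 1, where
    1.0 means perfect doubling each cycle.
    """
    n = len(log10_inputs)
    mean_x = sum(log10_inputs) / n
    mean_y = sum(cq_values) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(log10_inputs, cq_values))
             / sum((x - mean_x) ** 2 for x in log10_inputs))
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical 10-fold dilution series: Cq rises ~3.3 cycles
# per 10-fold dilution of template.
dilutions = [5, 4, 3, 2, 1]           # log10 copies of template
cqs = [15.1, 18.4, 21.7, 25.0, 28.3]  # measured Cq (hypothetical)
print(f"Efficiency: {pcr_efficiency(dilutions, cqs):.0%}")  # → Efficiency: 101%
```

An efficiency well below 100%, as in the anecdote that follows, is the cue to revisit primer design.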
This type of validation assay has helped many scientists improve the efficiency of their qPCR experiments. “I remember a scientist with a primer set generating a low PCR efficiency,” Alsarraj recalled. “Using the melt curve, I checked the specificity of the reaction to make sure there were no primer dimers. We set up a serial dilution of the template DNA, and it was clear from the number of peaks that more than one product was amplified as the amount of input nucleic acid was decreased. When we got rid of this primer set and redesigned the primers, the PCR efficiency jumped from 60 to 95 percent.”
Good qPCR Starts with Good Primers
One of the most frustrating problems in the development of a qPCR assay is off-target amplicon formation. Undesired amplicons can form by extension when one or more primers anneal to each other (forming primer dimers) or when primers anneal to unintended regions of the template with partial complementarity (forming mispriming products). Off-target amplicon formation can significantly degrade PCR performance by reducing efficiency and by increasing the propensity for false-positive and false-negative results.
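A crude illustration of why primer dimers arise: two primers whose 3´ ends are mutually complementary can prime on each other. The naive Python screen below (hypothetical sequences; real design tools use full thermodynamic models, and this is not any vendor's method) simply counts complementary bases when two primer 3´ tails are aligned antiparallel:

```python
def comp(base):
    """Watson-Crick complement of a single DNA base."""
    return {"A": "T", "T": "A", "G": "C", "C": "G"}[base]

def three_prime_dimer_score(primer_a, primer_b, window=5):
    """Count complementary bases when the 3' tails of two primers
    are aligned head-to-head (antiparallel) -- a crude screen for
    primer-dimer risk. Higher scores suggest the pair can extend
    on each other during reaction setup."""
    tail_a = primer_a[-window:]
    tail_b = primer_b[-window:][::-1]  # reverse for antiparallel pairing
    return sum(1 for a, b in zip(tail_a, tail_b) if comp(a) == b)

fwd = "AGCTGACCTGAGGAGT"  # hypothetical primer pair
rev = "TTGACGGAAAACTCC"
print(three_prime_dimer_score(fwd, rev))  # 5 of 5 tail bases pair: high dimer risk
```

In practice a pair scoring this high at the 3´ end would be redesigned before any bench work.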
TriLink BioTechnologies has developed a strategy for chemically protecting the 3´ end of the dNTP. In her presentation at the meeting, senior staff scientist Natasha Paul, Ph.D., described this method, including a proof-of-principle experiment. TriLink’s new hot-start PCR makes use of a tetrahydrofuranyl (THF) protecting group on the 3´ end of the dNTP. The 3´-protected dNTPs block the primer from being extended in the 3´ direction, and the protecting group comes off at the elevated temperatures of PCR. Thus, nonspecific amplification is prevented during the set-up stages, but PCR proceeds normally once the temperature is raised to 95°C.
In a proof-of-concept experiment, Dr. Paul verified this mechanism by showing that 3´-THF dNTPs were not incorporated by DNA polymerase prior to hot-start activation. Furthermore, the utility of 3´-THF dNTPs was verified for a number of different targets and in real-time PCR, where amplicon detection was linear from five to 5,000 copies. “This technology provides a way to block any DNA polymerase of interest from being active until your PCR begins,” Dr. Paul said. “The introduction of these dNTPs into your reaction set-up will allow for conversion of your polymerase of choice into a hot-start polymerase.”
qPCR excels at detecting small amounts of nucleic acid, but by its nature the technique is vulnerable to contamination and amplification bias. Particularly with complex samples such as clinical blood specimens, there is a risk that contamination will introduce an amplification bias or that the wrong template entirely will be amplified.
Reginald Beer, Ph.D., a principal investigator at Lawrence Livermore National Laboratory (LLNL), presented his work on monodisperse picoliter-droplet PCR. LLNL studies viral and bacterial pathogens that threaten the population and the food supply. “Viruses mutate quickly, constantly probing the immune defenses,” said Dr. Beer. “It’s hard to identify the causative agents of new pandemics. When people first present sick, it’s hard for public health organizations to know what they’re sick with.” The goal is to reduce the time to treatment in a disease outbreak by optimizing qPCR-based diagnostic methods. To this end, LLNL has harnessed microfluidic technologies to create chip-based qPCR.
Running qPCR in 10 picoliter droplets not only reduces sample and reagent consumption, it also isolates the nucleic acids in individual droplets and reduces PCR amplification bias. Standard bulk PCR works quite well for oligonucleotides at high concentrations, but not so well when the concentration of target DNA or RNA is low and the background is complex.
A 10 picoliter droplet is 1,000,000 times smaller than a standard PCR sample volume. At typical concentrations, Poisson statistics predict that such a droplet will contain at most one copy of viral DNA, Dr. Beer said. That means there is at most one possible template in each reaction. Not every droplet will amplify, because not every droplet will contain DNA.
In their first publication, in which the droplet approach was applied to vaccinia virus, Dr. Beer and his team came within 6% of the actual titer of the sample as measured on a benchtop qPCR system. “Generally, when you talk to people in the field they are happy to get 40%,” he explained.
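Recovering a titer from droplet data uses the same Poisson statistics in reverse. The sketch below applies the standard digital-PCR correction (not necessarily the exact analysis in the LLNL paper, and all numbers are hypothetical): from the fraction of droplets that amplified, the mean occupancy is λ = -ln(1 - p), which then scales by droplet volume to give a concentration:

```python
import math

def titer_from_positives(n_positive, n_total, droplet_pl):
    """Back-calculate template concentration (copies/uL) from the
    fraction of droplets that amplified. The Poisson correction
    lambda = -ln(1 - p) accounts for droplets that happened to
    receive more than one template copy."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / (droplet_pl * 1e-6)  # convert to copies per microliter

# Hypothetical run: 9,516 of 100,000 ten-picoliter droplets fluoresce,
# which back-calculates to roughly 10,000 copies/uL.
print(round(titer_from_positives(9_516, 100_000, 10)))
```

Because the readout is a count of positive partitions rather than a fluorescence threshold, this estimate needs no external standard curve.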
Strategies for Faster qPCR
The demand for qPCR data is such that the ability to run faster assays, with higher throughput and without sacrificing efficiency or sensitivity, is highly valued. Researchers often turn to instrumentation, reagents, and kits to increase speed, but tend to overlook the smaller details. Thermo Fisher Scientific has studied the optimization of speed in qPCR, and its findings were reported at the meeting by Ian Kavanagh, Ph.D., R&D manager of genomics. He showed that small details can make a big difference in overall performance.
One of the most significant ways that Thermo improved the speed and performance of its assays was through the use of white plastic plates, Dr. Kavanagh said. Clear plastic is more commonly used, but white plastic reflects more signal back up to the detectors, giving much greater sensitivity. A blue pigment in the master mix enhances the contrast between reagent and plastic, making it easier to see the liquid in the wells.
Dr. Kavanagh’s other interesting result was that fast cycling can be optimized by attending to specific traits of the PCR primers, without necessarily requiring specialized fast-cycling instrumentation. “As long as you can maintain the same sensitivity and reproducibility, anyone is going to want to gather their data rapidly so they can move on to the next stage of research.”
By the same token, most clinics and hospitals are eager to speed up qPCR-based diagnostic tests, but are limited by their equipment. Sundaresh Brahmasandra, Ph.D., vice president of product development at Handylab, explained in his presentation how Handylab’s instrumentation overcomes common problems in qPCR diagnostic assay systems.
Most of these systems are physically large, occupying a great deal of laboratory space. They are also often closed systems, meaning that users are limited to the tests offered by the manufacturer. The systems typically operate at high throughput, running a large number of tests at one time. In practice, this means that many hospitals and laboratories save up patient samples until they have enough to justify a large run, which effectively slows diagnosis.
Handylab has addressed these problems with its Jaguar benchtop system. The cost is competitive with the larger systems, Dr. Brahmasandra said. It can run up to 24 samples in two hours, he added, but the real advantage is that they do not all have to be the same test. In fact, they do not have to be tests manufactured by Handylab at all. Users can run their own assays on the system, in whatever combination is necessary.
“The underlying technology enables all of this to happen. We have a unique extraction chemistry and can prep from a variety of matrices,” said Dr. Brahmasandra. The other key ingredient is a microfluidic vessel that allows the reaction to be carried out in a 4 µL volume, which permits faster cycling. The entire amplification and detection process is complete in 20 to 25 minutes. “You can run 24 specimens, and they can literally be different specimen types used for targets with 24 different thermocycling profiles in neighboring locations…it’s like 24 mini-instruments within the system.”
Improvements in efficiency and reductions in cost are a good sign that qPCR is a mature technology that is ready for prime time in the lab or clinic. There are many choices of methods and instrumentation, but the most powerful “upgrade” is in the knowledge and experience of the operator in setting up a well-designed, fully optimized assay.