Feature Articles : Mar 1, 2010
qPCR Trends, Techniques, and Tools
Faster, More Efficient, and Less Expensive Way to Analyze Gene Expression Is the Ultimate Goal
The fast-paced, continually evolving field of quantitative PCR (qPCR) will be showcased at a CHI meeting in San Diego next month. Presenters will describe cutting-edge methodologies and emerging technologies such as digital microfluidics, nanopore and single-molecule sequencing, and improved ways to standardize data.
Traditionally, the first step in gene-expression studies is to purify RNA from samples. However, RNA isolation can be a problem for some applications. Gregory L. Shipley, Ph.D., assistant professor and director, Quantitative Genomics Core Laboratory, University of Texas Health Science Center, says that utilizing cell lysates for real-time qPCR is a helpful alternative. “Isolating RNA can be an expensive and time-consuming process. For small amounts of tissue or low numbers of cells, RNA purification isn’t practical.”
Dr. Shipley suggests that a better way is to utilize cell lysates directly for cDNA synthesis. “Lysates not only act as effective templates for amplification, they provide some added advantages. First, one can easily utilize a very small number of cells for real-time qPCR. If you tease out cell populations from a lung tissue sample, there are lots of cell types present. Typically, you might get a specific population of only about 20,000 to 100,000 cells using flow cytometry. It isn’t practical to isolate RNA from such a small number of cells, especially for analyzing multiple transcripts. Second, isolation of RNA can potentially skew the population, whereas with cell lysates nothing is lost.”
According to Dr. Shipley, there are some drawbacks that must be taken into consideration. “When using cells directly, the benefit is that everything is in there. However, the disadvantage is that everything is in there. It is important to lyse cells in the presence of agents that will inhibit RNA degradation and to have a way to get rid of contaminating genomic DNA.
“Several manufacturers offer kits for making cell lysates for real-time qPCR and more are jumping on the bandwagon. The challenge for the future will be to figure out how to use small amounts of tissue directly for qPCR. Currently, this works less well in our hands.”
Although gene-expression analysis by reverse transcription quantitative PCR (RT-qPCR) provides an accurate and sensitive means to measure gene expression, inherent biological variability can cause problems. “In the process of establishing a reliable qPCR, I found that biological variation between different samples taken at different days or in different animals was causing problems,” explains Erik Willems, Ph.D., research associate at the Neuroscience, Aging and Stem Cell Research Centre of Burnham Institute for Medical Research.
Dr. Willems then began working with an expert in biostatistics, Professor Jo Vandesompele of Ghent University in Belgium and co-founder of the qPCR data-analysis company Biogazelle. “We came up with several ways to address these interexperimental variations and developed a set of easy corrections that anyone can perform on their own data. This is a simple but very important contribution to the field, as people with difficult biological samples often tend to show their ‘best’ graph rather than the average of a larger sample set.”
The standardization procedure involves three steps, according to Dr. Willems. “The first step is to use a logarithm transformation. Basically, by putting all the data on a log scale, you reduce the effect of outliers because such a scale gives equal weight to all data points.
“The second step is to perform a mean centering. This basically means that we correct for differences in the control levels of a certain gene, so that the untreated conditions are all leveled for their gene expression. Each data point is divided by the average of all data points, a typical approach used in microarray data correction.
“The third step is to perform autoscaling. Here all data points are divided by the experimental standard deviation. This step corrects for differences in magnitude of differential gene expression between a set of biological replicates.”
After tweaking the data, Dr. Willems says the final analysis can be run with any statistical test of choice, such as a simple t-test. “This three-step standardization procedure can be easily performed by anyone somewhat knowledgeable in a spreadsheet program such as Excel. It should allow people to show their qPCR data with much more confidence.”
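The three steps described above can be sketched in a few lines of code. This is a minimal illustration, not Biogazelle's implementation; `standardize` is a hypothetical helper name, and it assumes the mean centering is applied on the log scale (a subtraction there is equivalent to dividing by the average on the linear scale).

```python
import math
import statistics

def standardize(values):
    """Three-step standardization of relative qPCR expression values:
    1. log transformation, 2. mean centering, 3. autoscaling.
    Hypothetical sketch of the procedure described in the article."""
    # Step 1: log transformation gives equal weight to all data points
    logged = [math.log2(v) for v in values]
    # Step 2: mean centering (subtraction on the log scale is equivalent
    # to dividing each linear-scale value by the average)
    mean = statistics.mean(logged)
    centered = [x - mean for x in logged]
    # Step 3: autoscaling — divide by the experimental standard deviation
    sd = statistics.stdev(centered)
    return [x / sd for x in centered]

# Example: relative expression values from one experimental series
print(standardize([1.0, 2.0, 4.0, 8.0]))
```

After these corrections, replicate series measured on different days or in different animals are on a common scale and can be fed directly into a t-test or similar.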
The miniaturization employed in qPCR microfluidic technologies and other applications shortens time to results, integrates sample preparation, and makes fluid handling portable. A new version called digital microfluidics promises to continue the microfluidic revolution. This technology employs electrowetting of droplets, a phenomenon in which an applied voltage modifies the surface tension of liquids on a solid surface.
“Digital microfluidics differs from traditional microfluidics in that there are no pipes, pumps, or valves,” reports Michael Pollack, Ph.D., co-founder of Advanced Liquid Logic. “Instead, discrete droplets are manipulated electrically, using electrodes to independently control each droplet. Digital microfluidics enables extremely flexible Lab-on-a-Chip devices that can be configured in software to execute virtually any assay protocol.”
One application is qPCR for point-of-care (POC) diagnostics. “For PCR, one can place a chip onto a stage containing two or three heaters that create different temperature zones on the chip. Thermal cycling is carried out by rapidly transporting a droplet back and forth between the temperature zones. PCR speeds are usually limited by how fast one can heat and cool rather than by the speed of the chemical reaction. Because the volume being heated and cooled is a droplet as small as 300 nanoliters, the temperature changes are very rapid. We have been able to push the limits of a 40-cycle PCR to about five minutes. For a robust commercial system, that will probably end up at 10–15 minutes per reaction, which is still a great improvement over conventional PCR.”
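The figures quoted above imply a per-cycle time budget that is easy to check; this is simple back-of-the-envelope arithmetic, not a vendor specification.

```python
# Per-cycle time implied by the demonstrated 40-cycle, ~5-minute run
cycles = 40
total_seconds = 5 * 60
per_cycle = total_seconds / cycles
print(per_cycle)  # -> 7.5 seconds per cycle

# For comparison, a projected 10-15 minute commercial run works out to
# roughly 15-22.5 seconds per cycle
print(10 * 60 / cycles, 15 * 60 / cycles)
```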
Dr. Pollack says that challenges remain for making the technology viable for POC applications. “POC devices need to integrate all process steps from sample to result. Another challenge is cost. However, a great advantage of our implementation is that the chips are fabricated on printed circuit boards for which a huge manufacturing infrastructure already exists. Therefore they can be made inexpensively enough to be single-use disposable devices.”
Next-generation sequencing technologies have revolutionized genomics in their ability to generate enormous amounts of sequence read-outs in a single run. However, accurate quantification of the input DNA library is critical in order to assure the optimal performance of the sequencing instrument and to reduce overall costs.
“Sequencing of DNA libraries can take several days to run, so it is important to obtain the most information you can during a run,” Bernd Buehler, Ph.D., senior scientist at Agilent Technologies, says. “For example, on the Illumina Genome Analyzer, the cDNA library is immobilized and amplified on a chip in a process called cluster generation. If the amount of DNA loaded is too high, the DNA clusters could overlap and produce poor-quality data. On the other hand, loading too little will result in low cluster density and reduce overall efficiency.”
Dr. Buehler says that using qPCR can improve both the accuracy and efficiency of the process. “We have done studies looking at the correlation of qPCR results with our High Sensitivity DNA Quantification Kit on the 2100 Bioanalyzer. In all cases, qPCR proved to be a complementary method for quantifying high-quality library preparations.”
Additionally, qPCR provides several other advantages, Dr. Buehler says. “qPCR is very specific and only detects adapter-containing sequences since the identical primer sequences are used for library amplification. Also qPCR is exquisitely sensitive and can be used to help avoid amplification bias by reducing or eliminating the need to amplify DNA before sequencing.”
However, utilizing an accurate standard is critical. “In principle, two sources of standard can be used for quantification: a previously quantified DNA library that was successfully sequenced, or a plasmid standard. The latter has the advantage that it can be easily reproduced for future use.”
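Quantification against such a standard typically relies on a dilution series and a Cq-versus-log10(copies) standard curve. The sketch below illustrates that general approach under assumed example values; `fit_standard_curve` and `quantify` are hypothetical helper names, not part of any vendor's kit.

```python
import math

def fit_standard_curve(log10_copies, cq_values):
    """Least-squares line through Cq vs. log10(copy number) for a
    standard dilution series. Illustrative sketch only."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(cq_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq_values))
             / sum((x - mx) ** 2 for x in log10_copies))
    intercept = my - slope * mx
    # A slope of about -3.32 corresponds to ~100% amplification efficiency
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Interpolate an unknown library's copy number from its measured Cq."""
    return 10 ** ((cq - intercept) / slope)

# Assumed example: 10-fold plasmid dilutions (log10 copies) and measured Cqs
slope, intercept, eff = fit_standard_curve(
    [7, 6, 5, 4, 3],
    [13.1, 16.42, 19.74, 23.06, 26.38])
print(round(eff, 2))   # efficiency close to 1.0 (~100%)
unknown_log10 = math.log10(quantify(18.0, slope, intercept))
print(round(unknown_log10, 2))
```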
“qPCR assays may provide an ideal solution for quantification of sequencing libraries,” Dr. Buehler says. “We feel that more and more researchers will be using this strategy in the future.”
GE Global Research is developing a third-generation sequencing technology that aims to be less expensive and more accurate. “We are working on a means to interrogate DNA that can be used for high-throughput, single-molecule DNA sequencing,” John Nelson, Ph.D., molecular biologist, reports. “This method can utilize less expensive optics by sequencing clonally amplified DNA, or alternatively, can be developed for a high-end system that can interrogate single molecules.”
The proposed method is a sequencing-by-synthesis strategy. “This uses two separate innovations together to accomplish DNA sequencing. First, we had already developed terminal phosphate-labeled nucleotides with an attached dye that is removed from the growing DNA strand upon incorporation. Next, we developed a method to ‘freeze’ the DNA polymerase as it incorporates this unique nucleotide.
“In this state, the three-part complex of DNA strand, polymerase, and incoming base is quite stable, and can be washed and interrogated to determine the identity of the trapped base by the color of the dye in the complex. After interrogation we allow the polymerase to add that single base and proceed on to form the complex on the next template base. This cycle is repeated over and over.”
According to Dr. Nelson, the new method has several advantages. “For instance, our method can be used to step DNA synthesis even through homopolymers, reducing some of the issues associated with blocks of identical bases. The method can also be used for single-molecule sequencing and provides a stable signal that can be interrogated carefully, yet requires no chemistry beyond DNA polymerization to completely remove the label. Since we add a new enzyme for each step, enzyme stability is not an issue.”
GE Global Research has demonstrated proof-of-concept for the method. It is now optimizing various aspects of its new sequencing system.
Nanopore DNA Sequencing
Another emerging cost-cutting technology is called nanopore DNA sequencing. According to Andrew D. Hibbs, Ph.D., CEO of Electronic BioSciences, “this is a revolutionary technology in its early stages. The idea is to be able to identify DNA bases directly as they move through a nanopore formed in a solid material or via a protein structure. The approach measures a single DNA molecule; there’s no need to amplify the DNA. You can move from days to hours to get the same information and the equipment is simpler.”
Current technologies consume reagents during the sequencing process and must cut the DNA into short strands, requiring extensive processing to assemble the sequence. In contrast, the new approach consists of a nanopore separating two solution-filled chambers containing the target DNA.
When a small (~100 mV) voltage is applied, DNA passes through the nanopore producing a signal that can be measured with standard electrophysiological techniques. Thousands of DNA molecules can be measured by a single pore with no inherent limitation on the number of bases in each molecule.
“Clearly this has the potential to save much time and money and even meet NIH’s goal of the $1,000 genome by 2014,” Dr. Hibbs says. “If successful, it could allow DNA sequencing to become a routine part of patient care. In addition to personalized medicine, it is likely that DNA modifications linked to development of cancers (such as methylation) can be determined via their nanopore electrical signal.”
According to Dr. Hibbs, making this emerging technology commercially viable will require three to five additional years of development. “Although polybases can now be distinguished, the technology must progress to seeing them individually along a strand. Companies are pursuing proprietary technologies to do this and to cash in on what will likely be a very lucrative area.”
The field of qPCR continues to provide alluring opportunities for faster, more efficient, and less expensive means to analyze gene expression.
© 2016 Genetic Engineering & Biotechnology News, All Rights Reserved