February 15, 2012 (Vol. 32, No. 4)

Michael D. O’Neil


Progress in real-time quantitative PCR (qPCR) technology has been steady since its invention approximately 15 years ago. Recent innovations, and where the technology is headed, will be discussed at Select Biosciences’ upcoming “Advances in qPCR” conference.

Jan Hellemans, co-founder and CEO of Biogazelle, emphasizes the critical importance of optimizing both the front end (experimental design) and the back end (data analysis) of the qPCR workflow. Biogazelle’s flagship product, qbasePLUS, is a software solution for the analysis of qPCR data.

Hellemans notes that “there are still misconceptions as to how to best address the problem of inter-run variation with the experimental design.” The key principles, he says, are to “avoid the problem if possible, minimize the problem if it is not avoidable, and to correct for any variation that should actually occur.”

It would be ideal to screen all samples for a given gene on the same plate, he says, adding that it is not necessary to screen the reference gene(s) on that same plate. “This is a common misconception,” he says, acknowledging that keeping each gene to a single plate is not always possible, particularly given the growing number of samples in today’s large-scale studies.

Measures should be taken to keep potential variation as small as possible. These include using the same qPCR instrument and Cq-value determination settings, using the same batch of reagents, and minimizing plate-to-plate variation through standardization. When inter-run variation does occur, Hellemans says, at least one sample should be analyzed in both runs so that the variation can be corrected for.
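
A minimal sketch of this kind of correction, assuming the shared sample (an inter-run calibrator) is run on every plate and that inter-run differences amount to a simple Cq offset; the function and variable names are illustrative and not taken from qbasePLUS:

```python
# A minimal sketch: shift each run's Cq values so that a shared
# inter-run calibrator (IRC) sample agrees across runs. Assumes the
# inter-run difference is a constant Cq offset per run.

def calibrate_runs(cq_by_run, irc_cq_by_run):
    """Return Cq values corrected for inter-run variation.

    cq_by_run:     {run_id: {sample_id: Cq}}
    irc_cq_by_run: {run_id: Cq of the shared calibrator on that run}
    """
    # Use the mean calibrator Cq across runs as the common reference point.
    mean_irc = sum(irc_cq_by_run.values()) / len(irc_cq_by_run)
    corrected = {}
    for run, samples in cq_by_run.items():
        offset = irc_cq_by_run[run] - mean_irc   # run-specific shift
        corrected[run] = {s: cq - offset for s, cq in samples.items()}
    return corrected
```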

According to Hellemans, statistical imputation methods (routine among statisticians but not yet widely adopted in qPCR data analysis) are a useful way to recover crucial missing data from qPCR experiments. The gold standard for normalizing qPCR expression data is normalization against multiple validated reference genes, he says. As experiments grow larger, so does the risk of missing data for one or more of these reference genes due to technical failure; imputation is an effective way to recover it, he adds.
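
A simplified sketch of the idea, not Biogazelle’s method: normalize a target against the geometric mean of several reference genes, imputing a missing reference Cq from the observed ones. Real imputation methods model gene-specific offsets across samples; the plain mean used here is only a placeholder.

```python
import math

def normalize(target_cq, ref_cqs):
    """Relative quantity of a target, normalized to reference genes.

    target_cq: Cq of the gene of interest
    ref_cqs:   reference-gene Cq values; None marks a technical failure
    """
    observed = [cq for cq in ref_cqs if cq is not None]
    # Naive imputation: fill a missing reference Cq with the mean of
    # the observed references (placeholder for a real imputation model).
    filled = [cq if cq is not None else sum(observed) / len(observed)
              for cq in ref_cqs]
    # Assuming 100% amplification efficiency, quantity halves per extra cycle.
    rq_target = 2.0 ** (-target_cq)
    rq_refs = [2.0 ** (-cq) for cq in filled]
    geo_mean = math.exp(sum(math.log(q) for q in rq_refs) / len(rq_refs))
    return rq_target / geo_mean

print(normalize(24.0, [20.1, None, 19.9]))  # second reference imputed
```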


Medians provide a more robust measure for qPCR analysis when confronted with outliers. [Biogazelle]

Maximizing Optimization

Marina Guillet is co-founder and vp lab director of TcLand Expression, which is developing companion diagnostics for immune-mediated disorders and biomarkers for solid organ transplantation. TcLand’s lead programs target rheumatoid arthritis and liver transplantation.

To ensure that a robust technological platform was used for biomarker development and validation, the entire qPCR process at TcLand was standardized and each step optimized, Guillet explains. This included developing stringent, robust standard operating procedures as well as gaining a comprehensive knowledge of all sources of variability within the process, from the choice of sampling procedure to data generation.

According to Guillet, a multitude of blood collection techniques can be used to obtain RNA. However, only a few of these are compatible with the logistical and technical constraints associated with the use of retrospective samples and large multicenter applications.

PAXgene blood RNA tubes (PreAnalytiX) were used to collect blood and extract RNA. The PAXgene system is the only combination of blood collection tubes and associated RNA extraction kits to be CE-marked and to have received 510(k) clearance from the FDA, Guillet explains.

In a recent 30-gene study, Guillet and colleagues demonstrated that trustworthy qPCR results can be obtained by paying particular attention to every single step of the workflow. The TcLand investigation was a nested study, similar to those that have been used to identify the most important sources of error in qPCR experiments.

Advantages of Digital qPCR

Reginald Beer, a principal investigator and associate program leader at Lawrence Livermore National Laboratory, will discuss the advantages of digital qPCR at the meeting, as well as the potential for ultra-fast qPCR. Beer was reportedly the first to demonstrate real-time, droplet-based digital qPCR on a chip in 2007, and he also has developed an instrument capable of detecting both DNA and RNA targets in picoliter droplets.

Beer notes that digital qPCR involves partitioning targets such that each PCR well contains either one copy of the target or none. This offers significant advantages for absolute quantification, because the starting template concentration can be determined with high accuracy from the fraction of positive partitions via Poisson statistics. The technology is complex compared with bulk PCR, however, and it is not appropriate for all qPCR applications, he reports.
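
A sketch of the Poisson correction behind digital PCR quantification (standard statistics, not LLNL’s code): with targets randomly distributed across partitions, the chance that a partition receives zero copies is exp(-λ), so λ = -ln(fraction negative) gives the mean copies per partition.

```python
import math

def copies_per_microliter(n_positive, n_total, partition_volume_ul):
    """Estimate target concentration from a digital PCR readout."""
    fraction_negative = (n_total - n_positive) / n_total
    lam = -math.log(fraction_negative)   # mean copies per partition
    return lam / partition_volume_ul

# Example: 4,000 positives among 20,000 one-nanoliter partitions.
print(copies_per_microliter(4000, 20000, 0.001))  # ~223 copies per uL
```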

Digital qPCR is ideal for low-copy number detection in a sample with a high concentration of interfering templates, such as applications involving rare mutation detection, absolute quantitation, copy-number variation, and haplotyping. Advances that Beer sees coming in digital qPCR include instrument and process simplification and higher throughput.

Emphasizing his belief that “speed will serve the broad scientific community,” Beer also touts his team’s recent work developing an ultra-fast PCR instrument that can complete 30 PCR cycles in under three minutes. Such an instrument would have numerous benefits, including higher throughput, faster results for clinical applications, and widespread utility in forensics and food safety.

The device reportedly achieves extremely fast thermal cycling, with heating and cooling rates of 45°C per second. “We were really encouraged by the fact that off-the-shelf polymerases worked at these speeds, and there are a lot of parameters that can be adjusted to potentially go even faster,” Beer says. He and his team are currently working to develop an optical detector for an ultra-fast qPCR instrument.
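
A back-of-envelope check of the cycle-time budget, assuming typical two-step set points of 95°C and 60°C (the actual set points were not specified):

```python
# Our arithmetic, with assumed set points: 30 cycles in under 3 minutes
# leaves about 6 seconds per cycle at the reported ramp rate.

ramp_rate = 45.0                 # deg C per second, as reported
denature, anneal = 95.0, 60.0    # assumed two-step PCR temperatures
ramp_time = 2 * (denature - anneal) / ramp_rate  # up + down: ~1.6 s
cycle_budget = 180.0 / 30                        # 6 s per cycle
hold_time = cycle_budget - ramp_time             # ~4.4 s left for holds
print(ramp_time, hold_time)
```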


Engineers test a new PCR instrument developed at Lawrence Livermore National Laboratory that can reportedly process biological samples in less than three minutes. [Jacqueline McBride/LLNL]

LATE-PCR

J. Aquiles Sanchez, a senior research scientist and an expert in cancer genomics at Brandeis University, has used LATE-PCR with Thermalight™ Lights-On/Lights-Off probes for the simultaneous qPCR mutation scanning of all four of the key EGFR exons (18–21) related to tyrosine kinase inhibitor (TKI) sensitivity/resistance in non-small-cell lung cancer (NSCLC). Current technologies for such mutation detection utilize DNA sequencing or high-resolution melting analysis, but are only able to interrogate one exon at a time, he explains.

With LATE-PCR, mixtures of different genotypes can be reliably distinguished down to the 10% level, even when the total number of starting copies is very small. The approach is “remarkably robust” and does not require quantification, Sanchez notes, adding that the clinicians he is collaborating with are ecstatic over the possible availability of this tool, which would permit rapid and cost-effective screening of small biopsy samples.

Nondestructive Isolation of mRNA

Karl Hasenstein, a plant physiologist and professor in the biology department at the University of Louisiana at Lafayette, is working to develop a technique for the nondestructive sampling of mRNA from cells. The goal is to quantitate mRNA in different regions of a cell, and at different times, without harming the cell.

Hasenstein says that his group uses a steel needle coated with oligo dT that can be inserted into cells to bind the poly(A) tail of mRNA and thus retrieve the mRNA without damaging the cell. The needles are actually acupuncture needles with a tip diameter of less than 15 microns.

To establish proof of principle, Hasenstein has used this coated needle approach to successfully collect mRNA and assess the distribution of two critical genes (BICOID and NANOS) in fruit fly egg cells. This coated needle-based mRNA sampling “can be completed in 60 seconds without damaging the specimen,” he says.

When the needles are reliably coated with oligo dT at a consistent density, the approach reportedly allows for absolute quantification without the need for reference genes. In addition, the success of this coated-needle approach could “ultimately eliminate the need for biopsies that can have many negative consequences.”

A Discrete Dynamical System

Steven Smith, professor of molecular science, urology, and urologic oncology at the Beckman Research Institute and Medical Center at the City of Hope, is working on an algorithm to address certain diagnostic anomalies in qPCR reactions. His algorithm treats qPCR as a discrete dynamical system and models it using Michaelis-Menten kinetics over a 50-cycle experiment.

Use of this algorithm “may permit improved cycle optimization and better quantitation,” Smith suggests. He emphasizes that while most empirical or semi-empirical descriptions of PCR use continuous functions, the data are actually discrete, and a discrete treatment of the dynamical system has certain conceptual and practical advantages. The algorithm yields average values for the dwell time, the maximal velocity, and the Michaelis constant of the DNA polymerase over the course of the reaction.
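
A toy version of such a per-cycle map, our illustration of the general idea rather than Smith’s algorithm: amplification efficiency decays in Michaelis-Menten fashion as accumulating template saturates the polymerase. The parameter values are arbitrary.

```python
# Illustrative discrete-cycle qPCR model (not Smith's algorithm).
# Per-cycle efficiency falls as template saturates the polymerase.

def simulate_qpcr(n0, e_max, km, cycles=50):
    """Return template copy number after each cycle."""
    n = float(n0)
    trajectory = [n]
    for _ in range(cycles):
        # Efficiency ~ e_max while n << km, then declines toward zero
        # as template competes for a limited amount of enzyme.
        efficiency = e_max * km / (km + n)
        n *= 1.0 + efficiency
        trajectory.append(n)
    return trajectory

# Doubles early, then flattens: the familiar amplification curve shape.
curve = simulate_qpcr(n0=100, e_max=1.0, km=1e11)
```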
