February 15, 2007 (Vol. 27, No. 4)

Trevor Stokes

Single Nucleotide Polymorphisms Face Several Hurdles Before Directly Impacting Healthcare

In 1998, the first international meeting on SNPs and complex genome analysis was held in Sweden. Since then, SNP technology has become more widely adopted. Clinical expectations remain high for diagnostic and pharmacogenomic uses of SNPs, and researchers are currently awash with data. But, according to several experts, it will still be years before SNP technologies directly impact healthcare.

Before SNP technology makes it to the clinic, it faces several challenges. According to a report from BioInformatics (www.gene2drug.com), one significant challenge facing the field is the difficulty of selecting an optimal genotyping method, since both the scale of a project and the scientific question it aims to answer heavily influence that choice.

The consensus is that the more SNPs assayed, the fewer samples can be run; conversely, the more focused the SNP panel, the higher the sample throughput. From high-throughput microarray-based technologies to low-throughput single-tube techniques, such as single nucleotide-based extension, researchers must compromise.

“Some technology platforms are optimized for high-throughput and a high number of SNPs, but low-throughput in the number of samples,” says Phoebe White, senior director of sequence detection systems/arrays at Applied Biosystems (www.appliedbiosystems.com). “How big the study is in terms of SNPs and samples is what drives technologies.”

Researchers face alternatives at every step of the process, from the selection of which molecular technique to use, to a myriad of choices for labeling, detection, and scoring.

SNP Sources

Much effort has been focused on human SNP discovery in the past ten years with impressive results. The latest release from the International HapMap Project contained nearly seven million SNPs from three populations. In the BioInformatics report, it is noted that greater than 70% of SNP genotyping is done on samples of human origin and one-fifth of samples are murine-based.

“The problem with published SNPs is that there is a lot of redundancy,” says Zhiming Jiang, product manager for SNPstream at Beckman Coulter (www.beckmancoulter.com). “The same SNP could be presented by different identification numbers.” Further, a lot of the SNPs are not validated and can be misleading if they have different frequencies in different populations.

Though scientists still need to validate massively produced data, researchers who study the human genome still benefit from publicly available data from the HapMap and Human Genome Projects. However, for other species, data availability can be limiting.

“If you look at agriculture, plants, or animals, a lot of the genome sequences are confidential or are owned by private companies,” Jiang notes. While less of a problem for small boutique projects, data restrictions can clearly leave nonhuman species behind in SNP resources.

Species with rich datasets, like humans, still have room for SNP improvements in the form of validation. “We think that the biggest development in the past year or so in terms of SNP assays is the validated assay,” explains Tamara Zemlo, director of syndicated research and analysis at BioInformatics. “This is an interesting phenomenon because we have observed it in other technologies that use oligos.”

Prepackaged SNP assays sometimes can be cheaper than designing custom oligos. “They have proven performance, and that is what people are really needing in their experiments,” says Zemlo.

Data Analysis

Even with validated SNPs and a robust range of technologies, storage and analysis of these massive high-throughput datasets are still a point of contention.

“What to do with the volumes of data that come out of these chips and interpreting the data are going to be two of the most critical things to solve,” says Todd Dickinson, director of product marketing at Illumina (www.illumina.com).

“I wouldn’t say it is controversial, but it is a point of discussion,” adds Dickinson. “There are conferences that literally do nothing but study how people look at these data and how to perform statistical analysis in a proper way.”

Several companies already provide initial SNP-genotyping analysis as part of their platforms, but so far no definitive analysis standard has been set. SNP analysis still needs research, but as Jiang notes, “it is an area of opportunity as well.”

Analysis at the whole genome level may become increasingly important as researchers use high-throughput methods at the exploratory phase of their research programs. “A lot of labs start with whole genome SNP analysis, maybe 100K, maybe 500K, maybe even 1 million SNPs. The density just keeps going up,” Jiang adds. “Whether you are drawing the right conclusion is very challenging. That is what we call the bottleneck of the entire flow now.”

Previous high-throughput experiments with mRNA microarrays have laid some of the groundwork for SNP genotyping, but researchers are still demanding greater ease of data storage and visualization.

However, some researchers argue that the very nature of SNPs eases data analysis. Instead of a spectrum of values, as with mRNA quantification, SNPs provide researchers with data that is either there or not. Kevin Munnelly, senior director and business unit manager for genomic products at BioTrove (www.biotrove.com), notes that customers who use the company’s OpenArray™ NT Imager Genotyping System do not have the problems with data deluge that exist with other technologies and applications.

“You get three possibilities, three populations of data—not so difficult to deal with, even with the volume,” says Munnelly. “With SNP genotyping, if it is done correctly, all you really need to do is say it is either allele one, allele two, or both.” In one example, an experiment required 64 SNPs tested in a population of 25,000 patients. Munnelly reports that even in that case, where the data volume was so great, it was easy to handle.
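Munnelly’s three-cluster description can be sketched as a toy genotype caller. Everything below is an illustrative assumption, not BioTrove’s or any vendor’s algorithm: real platforms fit clusters to raw fluorescence data statistically, whereas this sketch simply thresholds the ratio of two hypothetical allele-specific signal intensities.

```python
# Toy genotype caller: classify each sample by the relative fluorescence
# of its two allele-specific signals. Thresholds are illustrative only.

def call_genotype(signal_a, signal_b, ratio_cutoff=0.2):
    """Return 'AA', 'BB', 'AB', or 'no call' from two allele intensities."""
    total = signal_a + signal_b
    if total == 0:
        return "no call"          # failed reaction, no signal
    theta = signal_b / total      # 0 -> pure allele A, 1 -> pure allele B
    if theta < ratio_cutoff:
        return "AA"               # homozygous for allele one
    if theta > 1 - ratio_cutoff:
        return "BB"               # homozygous for allele two
    return "AB"                   # heterozygous: both alleles present

# One SNP across a few samples: calls fall into the three clusters.
samples = [(950, 40), (60, 880), (510, 470), (0, 0)]
print([call_genotype(a, b) for a, b in samples])
# -> ['AA', 'BB', 'AB', 'no call']
```

The discrete output is exactly why the data volume stays manageable: each genotype compresses to one of a handful of categories rather than a continuous measurement.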

As project datasets grow larger and larger, one of the most prevalent underlying issues researchers face is cost. “People want accuracy, but they don’t want to pay lots of money,” notes Zemlo. A SNP genotype used to cost $1.00, but experts now say the cost ranges from $0.03–$0.35 per genotype.
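At those per-genotype prices, even a modest candidate-SNP study adds up quickly. A back-of-the-envelope calculation for the 64-SNP, 25,000-patient example above (the price range comes from the experts quoted; the arithmetic is illustrative):

```python
# Back-of-the-envelope genotyping cost for a 64-SNP, 25,000-patient study.
snps = 64
patients = 25_000
genotypes = snps * patients        # total genotype calls needed
low, high = 0.03, 0.35             # quoted cost range, $ per genotype

print(f"{genotypes:,} genotypes")                              # 1,600,000 genotypes
print(f"${genotypes * low:,.0f} - ${genotypes * high:,.0f}")   # $48,000 - $560,000
```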

Several methods to lower cost exist, including reagent reduction, automation, higher throughput, and education. “Most of our marketing efforts are in education to help people understand the real advantages of different genotyping systems,” says Munnelly. “Even though you may think of the reagent cost as being the most important thing, other factors, such as labor costs, are also important.”

Cost does come with a flip side. “Accuracy is becoming a more important factor in some cases than the economics of the assay,” notes Richard Eglen, Ph.D., vp and general manager of discovery and research reagents at PerkinElmer Life and Analytical Sciences (www.perkinelmer.com).

In some cases, accuracy is the first concern of researchers, whereas cost is secondary. One argument is that increased accuracy lowers cost in the downstream pipeline.


One trend for single-tube reaction SNP genotyping that can reduce cost is multiplexing, where more than one SNP can be assayed in the same space, whether that is a tube or a speck of liquid. “We have seen an increased focus on multiplexing where the requirement is moderate throughput of a sample,” Dr. Eglen says. “This allows researchers to get more information on a limited number of tissue samples.”

Multiplexing doesn’t work for whole genome scans, but, according to Jiang, “Multiplexing is important for the signature validation stage because it drives down the cost.” Beckman’s platform currently has 12-plex and 48-plex capability.
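The economics behind Jiang’s point are simple amortization: fixed per-reaction costs get spread across every SNP in the plex. The dollar figures below are made up purely for illustration; only the plex levels come from the platforms discussed.

```python
# Illustrative per-genotype cost as fixed per-reaction costs are amortized
# across a multiplex. Dollar figures are hypothetical, not vendor pricing.

def cost_per_genotype(plex, reaction_cost=2.00, per_snp_reagent=0.05):
    """Fixed reaction cost split across `plex` SNPs, plus per-SNP reagents."""
    return reaction_cost / plex + per_snp_reagent

for plex in (1, 12, 48):
    print(f"{plex:>2}-plex: ${cost_per_genotype(plex):.3f} per genotype")
```

Whatever the actual numbers, the shape of the curve is the point: per-genotype cost falls steeply as plex level rises, which is why multiplexing pays off at the validation stage.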

According to Dr. Eglen, “The more plexing we can do in an assay, the more appropriate we feel it is going to be for the marketplace. Our goal is to get more information out of the same sample.”

DNA Comparative Hybridization

One interesting innovation comes not from SNP research at all, but from researchers identifying chromosomal abnormalities such as indels and copy number changes.

“The latest trend is using whole genome SNP arrays to garner chromosomal aberration information and to study DNA copy number,” says Dickinson. “The advantage to using SNP-based approaches over oligo array CGH-type approaches is that not only are you obtaining information about DNA copy number, whether it be amplifications or deletions in DNA, but you are also generating genotyping information at the same time.”

Several product developers have seen a shift toward researchers using high-throughput platforms to capture non-SNP genomic changes.

“People don’t want to necessarily just look at SNPs when they are doing genotyping,” notes White. “The analysis of SNPs is here to stay and, while it is a major driver of pharmacogenomics and all of genotyping, it is going to be supplemented. The more we learn about the structure of the genome, the more we are learning that it is not just SNPs that are important. There is increasing pressure on technology providers to develop applications that can be more flexible to cater to some of these applications.”

Cost and Throughput

With so many methods available to find and validate SNPs, researchers may find it challenging to pick the optimal genotyping method for their particular research. “The broad range of genotyping methods available now seem to all have comparable sensitivity and specificity,” says Deepika de Silva, Ph.D., program manager, genomic systems at Idaho Technology (www.idahotech.com). “Researchers seem to be evaluating throughput and cost as the primary criteria for choosing any given method.”

Idaho Technology offers a genotyping method that uses the double strand DNA binding dye LC Green in place of expensive fluorescent probes to provide a solution for the low-to-medium throughput market. “Genotyping methods that tie pharmacogenomics into clinical practice need to be simple, fast, and reliable. Low cost and fast turn-around should help the adoption of these tests into the clinical market, as will physician education pertaining to the value of the test,” says Dr. de Silva.

Another important parameter to be considered when choosing the optimal genotyping method is the size and scope of the study. “For most purposes, genotyping projects can be segregated into whole genome studies and candidate region/gene studies,” says Michael A. Monko, senior vp sales and marketing, Sequenom (www.sequenom.com). “In general, whole genome studies are conducted with a comparatively larger number of SNPs than candidate studies, whereas the latter typically are run on a larger set of samples.”

Current genotyping technologies differ in their suitability for those two categories of studies. Microarray-based technologies dominate the whole genome scan field, while scalable platforms, such as Sequenom’s iPLEX™ Gold Assay, are more appropriate for candidate or fine-mapping studies, says Monko.

“The process of translating pharmacogenetic research into clinical practice that benefits the patient and makes economic sense is as such very challenging. We are certainly only at the beginning of this process and do not know to what extent it will have an impact and when. For complex diseases, such as diabetes, hypertension, or cancer, we are dealing with multiple mutations whose individual impact is modified by both epigenetic phenomena and the environment. The challenge will be to develop clinically relevant tests that integrate all of those data efficiently,” predicts Monko.

Clinical Applications

Whether the information includes SNPs, indels, or copy number changes, ultimately these technologies are market-driven toward clinical applications to diagnose diseases or predict drug effectiveness. “I think this field has high potential moving forward, but at this stage, I don’t think this particular area is well-defined,” explains Jiang.

For some diseases, such as cystic fibrosis, the mutations are well-established. However, for other diseases, such as cardiovascular disease, diabetes, or certain cancers, the underlying genetics is much more complex, which magnifies the difficulty of translating SNP technologies into the clinic.

Another challenge is making sense of the SNP data. “Demonstrating clinical utility of SNP genotyping is going to be important,” says Dickinson, “At this time, it is not technology-limiting, it is discovery-limiting.”

Once assays reach the clinical stage, the new technology will have to be adopted by clinicians. “One of the key challenges is the education of clinicians in respect to pharmacogenetics,” explains Marcus Hausch, Ph.D., academic marketing manager for DNA at Affymetrix (www.affymetrix.com). “Historically, clinicians haven’t been trained to understand and use genotyping information in their decision-making. Additionally, in some disease areas, the link between genotype and phenotype needs to be better understood before it can be applied to clinical practice.”

After the initial hype, researchers are now testing the utility of SNP genotyping. “I think, more or less, everybody sees the realistic picture. It needs more time to become established,” notes Edgar Setzke from Qiagen (www.qiagen.com). “There will probably be certain applications and healthcare problems that cannot be solved with SNP genotyping.

“Maybe other technologies, such as gene expression, protein markers, metabolites, or other kinds of biomarkers, can provide data for better treatment,” Setzke adds. “The major breakthrough for the mass application of SNP genotyping that will find its way into every newspaper is still missing.”
