Genomic data is pouring out of academic and government research labs, flowing into rapidly expanding databases that often overwhelm storage and informatics capabilities. It is also fueling a steady stream of publications proposing links between the “needles” discovered in genomic haystacks and disorders ranging from common to rare, from cancer to Mendelian disease.
The researchers in hot pursuit of these discoveries are benefiting from rapid technological advances: new instruments and algorithms designed to sequence whole genomes faster, cheaper, and more accurately; to analyze gene expression; and to identify, on ever-expanding microarrays, SNPs that may distinguish “healthy” from “disease” or “at-risk” genomes.
The promise riding on early efforts to apply these discoveries to real-world clinical scenarios and to diagnostic, prognostic, and therapeutic decision making is tempered by the complexity of the genomic landscape, the sheer volume of the data, and the challenge of interpreting raw genomic data to yield clinically useful information and to bridge the gap between the laboratory and the practice of genomic medicine.
At the recent American Society of Human Genetics (ASHG) meeting held in Washington, D.C., a session entitled “Genomic Medicine: Current Status, Evidence Dilemmas, and Translation into Clinical Practice” presented a real-world view of the opportunities and obstacles in acquiring and applying the data from whole-genome sequencing (WGS) and genome-wide analysis (GWA) studies.
Although the technology is available to sequence an individual’s genome, it is still a relatively costly endeavor, and how best to analyze and interpret the clinical significance of the results is not yet clear. Session moderator Kelly Ormond, associate professor and director of the master’s program in human genetics and genetic counseling at Stanford University, described the tension that exists between two competing forces driving genomic medicine: the desire to accelerate early adoption of the technology in the clinic and the cautionary voices that question the clinical utility of the information.
In her overview of WGS technology, Debbie Nickerson, Ph.D., a professor in the department of genome sciences at the University of Washington, emphasized the impact that next-generation sequencing (NGS) and emerging third-generation single-molecule sequencing technology will have on advancing knowledge about human genome variation and the ability to link genetic and phenotypic variation.
The results of large-scale GWA studies are increasingly populating leading scientific journals and facilitating family-based linkage analysis and disease association studies, she noted. The speed and cost of these studies will continue to improve as high-density chips containing one million variant markers give way to arrays carrying as many as five million SNPs.
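At their core, the case-control disease association studies mentioned above compare allele frequencies between affected and unaffected groups at each SNP. As a rough illustration only, and not a method described at the session, the sketch below computes a chi-square statistic for a single SNP from a 2x2 table of allele counts; the counts and the significance threshold shown are invented for demonstration.

```python
# Illustrative sketch of a single-SNP case-control association test.
# The allele counts below are hypothetical, not from any real study.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for a 2x2 table of allele counts:
    rows = cases/controls, columns = risk allele / other allele."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    # Expected counts under independence of disease status and allele
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical SNP: risk-allele vs. other-allele counts on case and
# control chromosomes (1,000 diploid individuals per group).
cases = (1100, 900)
controls = (950, 1050)
stat = chi_square_2x2(cases[0], cases[1], controls[0], controls[1])
print(round(stat, 2))  # compare against the chi-square(df=1) critical value, e.g. 3.84
```

In a genome-wide scan this test would be repeated at each of the million or more markers on the array, which is why genome-wide significance thresholds are set far more stringently than the single-test cutoff noted in the comment.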
Dr. Nickerson described the new, higher-throughput, lower-cost genome sequencing strategies as “disruptive, game-changing” technology. Examples include systems developed by 454 Life Sciences (a Roche company), Illumina’s HiSeq sequencing by synthesis technology, the Applied Biosystems/Life Technologies SOLiD system, the Complete Genomics CGA platform, Pacific Biosciences’ Single Molecule Real-Time (SMRT) sequencing technology, “The Chip is the Machine” semiconductor chip-based system from Ion Torrent (recently acquired by Life Technologies), and Oxford Nanopore Technologies’ nanopore-based sequencing strategy.