October 15, 2018 (Vol. 38, No. 18)

NGS is Attending to Technological, Regulatory, and Educational Issues

Almost 2500 years ago, Hippocrates captured one of the key principles underlying precision medicine—“It’s far more important to know what person the disease has than what disease the person has.”

In the 21st century, we are taking the understanding of a person's individual characteristics to a new level. By leveraging information about an individual's genome, we can increase the effectiveness of medical treatments, with the goal of achieving better outcomes through targeted therapies. This article looks at the past, present, and future of clinical genomic analytics and how they benefit human health.

The foundation for clinical DNA analysis was laid with the completion of the Human Genome Project in April 2003, which yielded a published reference sequence spanning all three billion base pairs of the human genome. Its completion marked the first phase of the clinical application of next-generation sequencing (NGS) data (Figure). In this phase, the community was primarily focused on understanding the correlation between genetic markers and diseases. To date, researchers have identified over 1800 genes with disease associations. As individual variants in these genes are studied, clinically relevant conclusions can be drawn and incorporated into the overall diagnosis and treatment of patients. Publicly available databases such as ClinVar codify this knowledge and allow clinicians around the world to leverage the current best genomic interpretations in their day-to-day work.
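As an illustration, resources like ClinVar expose their content programmatically. The minimal sketch below queries ClinVar through NCBI's public E-utilities endpoints; the choice of BRCA1 as the search gene and the record limit are illustrative only, and the exact fields of the returned JSON may vary between releases.

```python
# Minimal sketch: look up ClinVar records for a gene via NCBI E-utilities.
# BRCA1 and retmax are illustrative choices; JSON field names may vary.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def search_clinvar(gene: str, retmax: int = 5) -> list[str]:
    """Return ClinVar record IDs for variants in the given gene."""
    resp = requests.get(f"{EUTILS}/esearch.fcgi", params={
        "db": "clinvar", "term": f"{gene}[gene]",
        "retmax": retmax, "retmode": "json",
    })
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

def print_titles(ids: list[str]) -> None:
    """Fetch summaries for the given IDs and print each record's title."""
    resp = requests.get(f"{EUTILS}/esummary.fcgi", params={
        "db": "clinvar", "id": ",".join(ids), "retmode": "json",
    })
    resp.raise_for_status()
    result = resp.json()["result"]
    for uid in result["uids"]:
        print(uid, result[uid].get("title", "<no title>"))

if __name__ == "__main__":
    print_titles(search_clinvar("BRCA1"))
```

In practice, clinical pipelines typically consume ClinVar's bulk releases rather than issuing live queries, but the principle is the same: codified interpretations that any lab can retrieve and apply.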

The cost of sequencing a genome has dropped dramatically since the first draft of the human genome sequence was published in 2001, and this is a key enabler: gene panel analysis, and even whole-exome or whole-genome analysis, is now within reach for a wide range of patients. However, more is required. Through a collaborative effort among clinicians, pharmaceutical companies, scientists, and regulatory agencies, we are working on a new framework for incorporating genomics into standard care on a global basis.

Conservatively, we entered the phase of moderate adoption of NGS-based analytics in the clinic about 10 years ago. The goal is to diagnose diseases better, predict their outcomes, and choose the best possible care option for each patient. We are currently in the midst of this adoption phase (Figure).

Many success factors are key to driving adoption in the moderate phase of the curve:

•  Regulatory Environment. The FDA is paying increasing attention to how genetic information about individuals is handled. Already, a growing number of drugs are approved together with a companion diagnostic test that leverages biomarkers.

•  Testing Technology. Many experts believe that the use of NGS data, including whole exomes and genomes, will ultimately give us the insights needed to diagnose or predict diseases and select treatment options at the highest possible level. We are already seeing increasing use of whole exomes and genomes in testing labs.

•  Reimbursement. The adoption of these tests in the clinic hinges on the ability of doctors to recover the expense associated with genetic tests. Payors are increasingly willing to pay for NGS-based tests. Ultimately, it is key for this industry that such tests become part of the routine way clinicians diagnose disease, and payors are looking for price points to come down so that global adoption is financially feasible.

•  Physician Education and Acceptance. We are facing the need to educate a wide range of healthcare specialists involved in designing, conducting, interpreting, and utilizing genetic tests: translational researchers, pathologists, geneticists, genetic counselors, biostatisticians, etc.

•  Bioinformatics Capability. A plethora of new tests in development require massive computing resources to deal with the sheer amount of data. The storage requirements for a whole genome, depending on the coverage, range between 100 and 200 GB for a single person, and added to that are the results of variant analysis and the other datasets generated during analysis (a rough sizing sketch follows this list). For broader-scale usage, these datasets and their interpretations must become part of the patient record. This requires investment in an infrastructure to generate, gather, store, research, and clinically interpret these data on a large scale, an infrastructure that is not currently in place.

•  Patient Demand. All tests require patient consent. Because of this, broader adoption of genetic tests also depends on patients agreeing to the use of their DNA for this purpose.
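To make the storage numbers in the bioinformatics bullet above concrete, here is a back-of-envelope sizing sketch. The 100-200 GB raw-read range comes from the text; the sizes assumed for aligned reads and variant calls are illustrative placeholders, not measured values.

```python
# Back-of-envelope estimate of the per-patient genomic data footprint.
# Raw-read range (100-200 GB) is from the article; the aligned-read and
# variant-call ranges below are illustrative assumptions only.
RAW_READS_GB = (100, 200)   # FASTQ, depending on coverage
ALIGNED_GB   = (80, 150)    # BAM/CRAM, assumed
VARIANTS_GB  = (0.1, 1.0)   # VCF plus annotations, assumed

lows, highs = zip(RAW_READS_GB, ALIGNED_GB, VARIANTS_GB)
low, high = sum(lows), sum(highs)

print(f"Per-patient footprint: ~{low:.0f}-{high:.0f} GB")
# Scaled to a hypothetical 10,000-patient health system (1 PB = 1e6 GB):
print(f"10,000 patients: ~{low * 10_000 / 1e6:.1f}-{high * 10_000 / 1e6:.1f} PB")
```

Even a single mid-sized health system, under these assumptions, is looking at petabytes of data that must be stored, secured, and linked to patient records.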

The genetic testing technology and infrastructure are evolving quickly. Adoption is expected first, with significant testing volume, in oncology, rare disease diagnosis, and pediatric and newborn screening. Beyond these areas, there is definite potential in obesity, diabetes, and cardiac disorders.

In addition, a considerable uptick in adoption is expected in the field of pharmacogenomics, where testing informs the safety, efficacy, and cost of care. New applications of genetic testing will change current care teams and processes, reshaping how pathologists, oncologists, geneticists, genetic counselors, biostatisticians, and bioinformaticians work together.

Based on current trajectories, it will take us well into the next 5 to 10 years to clear the hurdles characteristic of the moderate adoption phase. So, what are the top-level issues that will need to be addressed once we reach the stage of full adoption? Here are some of the most pressing topics:

•  Automation. In 2017, there were 3,945,875 births in the United States alone. Applying routine tests to every newborn would by itself require a massively scalable and highly automated infrastructure that surpasses everything we have established in hospitals and testing labs across the nation (a back-of-envelope calculation follows this list).

•  Population-Scale Sequencing. Our understanding of genetically caused diseases increases with the statistical power of the underlying dataset. Once a highly scalable infrastructure and testing operation is installed nationwide, we will be in a position to generate massive datasets that can be mined for undiscovered associations and used to verify findings previously made with much smaller datasets (a worked power calculation also follows this list).

•  Interdisciplinary Collaboration. The diagnosis and treatment of disease will ultimately require integrating data from multiple platforms, such as imaging, biosensors, and predictive analytics, to name just a few.
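Two quick calculations make these points concrete. First, for the automation bullet: combining the 2017 birth figure with a mid-range 150 GB per genome (the midpoint of the range cited earlier; the rest is simple arithmetic) gives a sense of the required scale.

```python
# Rough scale of sequencing every U.S. newborn, using the article's figures.
BIRTHS_2017 = 3_945_875        # U.S. births in 2017 (from the article)
GB_PER_GENOME = 150            # mid-range of the 100-200 GB cited above

annual_pb = BIRTHS_2017 * GB_PER_GENOME / 1e6   # 1 PB = 1e6 GB
per_day = BIRTHS_2017 / 365

print(f"Annual raw data volume: ~{annual_pb:,.0f} PB")       # ~592 PB/year
print(f"Required throughput: ~{per_day:,.0f} genomes/day")   # ~10,811/day
```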
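Second, for population-scale sequencing: the payoff of larger cohorts can be illustrated with a textbook two-proportion power calculation (a generic statistical approximation, not a method from the article; the allele frequencies and 5% significance level are illustrative).

```python
# How power to detect a small case/control allele-frequency difference
# grows with cohort size (two-proportion z-test, normal approximation).
from math import erf, sqrt

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power(p_case: float, p_ctrl: float, n: int, z_alpha: float = 1.96) -> float:
    """Approximate power with n samples per group at a 5% two-sided level."""
    p_bar = (p_case + p_ctrl) / 2
    se_null = sqrt(2 * p_bar * (1 - p_bar) / n)
    se_alt = sqrt((p_case * (1 - p_case) + p_ctrl * (1 - p_ctrl)) / n)
    return norm_cdf((abs(p_case - p_ctrl) - z_alpha * se_null) / se_alt)

# Illustrative: a risk allele at 22% frequency in cases vs. 20% in controls.
for n in (1_000, 10_000, 100_000):
    print(f"n = {n:>7,} per group -> power ~ {power(0.22, 0.20, n):.2f}")
```

Under these assumptions, power rises from roughly 0.2 at 1,000 samples per group to above 0.9 at 10,000, which is why population-scale datasets can both surface new associations and confirm findings from smaller cohorts.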

For thousands of years, the smallest thing humans could detect was about as wide as a human hair. This changed with the invention of the compound microscope toward the end of the 1500s. Over time, the microscope became a cornerstone of clinical diagnosis. It has been only a little over 40 years since the invention of DNA sequencing and a little over 15 years since the completion of the Human Genome Project. Yet NGS-based clinical testing is poised to rapidly transform both biomedical research and how we apply this knowledge in the clinic.


Figure. Next-generation sequencing (NGS) is moving along the adoption curve from early to high adoption. Key to this progression will be several success factors.

Andreas Scherer, Ph.D., is president and CEO of Golden Helix.
