September 1, 2011 (Vol. 31, No. 15)
Commercial Potential of Technology Demands that Industry and Investors Alike Stay Up to Date
At Scientia we do a great deal of thinking about the evolution of sequencing technology, its commercial potential, and the strategic implications for stakeholders. In this article we outline, define, and describe some of the more forward-looking trends we are following, including sequencing activity polarization, workflow value shifts, and clinical and commercial applications.
The first trend to watch is sequencing activity polarization, which we define as an increasing share of total high-throughput sequencing activity being conducted by the largest and smallest sequencing labs. It is driven by a series of centralization and decentralization forces that are acting on the space simultaneously.
The most important centralization force is the emergence of research consortia seeking to sequence and analyze large numbers of human and other large (meta)genomes, with the goal of discovering new variants and exploring their significance to human health and other matters of economic or scientific importance.
This force drives centralization because there are substantial economies of scale in analyzing large numbers of genomes. The cost of sample logistics and tracking, sample preparation, the sequencing itself, IT infrastructure, and analysis tends to fall as the number of samples being analyzed by a given center expands.
Scientists who outsource their large sequencing projects to larger labs may benefit from these labs’ lower cost basis as well as a reduction in technology acquisition and obsolescence risk. Furthermore, a service provider may be able to provide more standardized and reproducible results.
This centralization process reminds us somewhat of the history of the DNA synthesizer market. In the ’90s many labs synthesized their own DNA, but over time end users have increasingly turned to commercial custom oligo suppliers that are able to provide lower costs and better quality.
The forces resulting in the decentralization of sequencing activity are more numerous and complicated. The most discussed force is the availability of desktop-format, high-throughput sequencers that have significantly lower price points than their high-end peers.
This emerging product category has the potential to displace capillary electrophoresis (CE) technology in some application areas, given greater scalability in terms of experiment size and faster turnaround times. More important but somewhat less discussed are the user needs driving demand for such systems. In research, these applications include microbial sequencing projects. In the clinic, these applications include panels of tests for oncology, genetic disease, and infectious diseases.
The second major trend to watch is workflow value shifts, which we define as a period of more pronounced decommoditization and commoditization of the resources used in the high-throughput sequencing workflow beyond the sequencing instrumentation and consumables themselves. The key drivers of this trend are primarily technical: the maturation of second-generation technology and the introduction of later-generation sequencers.
We view technical improvements to second-generation sequencers as the critical force behind an increase in the relative value (i.e., decommoditization) of other instruments, consumables, software, IT infrastructure, and expertise used in the overall workflow.
At the highest level, our contacts in major genome centers have shifted their spending allocation from ~65% on tools (instruments, consumables) in 2005 to ~40–50% on tools in 2010. Further decreases in the fraction of genome center budgets being spent on tools are expected to be driven by increasing investment emphasis on IT/informatics infrastructure, sample logistics infrastructure, and additional personnel such as bioinformaticians and pathologists.
Drilling down into the tools portion of spending, we see increasing emphasis on sample-preparation instrumentation and consumables, DNA shearing and size-selection instrumentation and consumables, bisulfite sequencing kits, chromatin immunoprecipitation (ChIP) kits, sequence-enrichment instruments and consumables, and bioinformatics tools over the next few years.
An opposite, but somewhat longer-term, force is the introduction of third- and later-generation sequencing technologies, which may well make large swaths of the workflow obsolete. Single-molecule, real-time technologies are expected to avoid amplification steps, thereby requiring less sophisticated sample prep; to deliver longer read lengths that may do away with the expensive BAC libraries used in de novo sequencing workflows; to offer a low cost per Gb, removing the need for sequence enrichment; and, potentially, to directly detect methylated and other modified bases, doing away with bisulfite sequencing kits.
By replacing these tools with higher-end sequencing equipment, we see the potential for later-generation sequencing technology to be priced with a workflow-consolidation premium.
Diffusion of Technology
A trend of significant interest within the life science community is the diffusion of sequencing technology into clinical and commercial applications. Whereas the other trends we have discussed are characterized by a number of forces working at odds, the two major forces driving this trend—namely increasing demand for higher performance and a rapidly expanding technical performance envelope—are both working together to enhance adoption outside of research.
While most sequencing discussions focus on the technology’s performance-improvement trajectory, we believe that the evolution of customer needs is both more important and changing fast enough to be of considerable interest. Demand for higher performance is driven by a desire to decrease false negatives and false positives from an analytical perspective and to increase positive and negative predictive value from a clinical perspective.
Performance dimensions that clinical and commercial customers increasingly see as important for limiting false results and increasing predictive value include the plexity of the analytical modality (how many markers are measured in a given test), its sensitivity (the limit of detection, or the ability to detect mutated cells in a background of wild type), and its quantification capability (the ability to count the number of times a particular analyte is present).
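To make the clinical side of that distinction concrete, the sketch below restates the textbook definitions of positive and negative predictive value and works them through with purely illustrative values for sensitivity, specificity, and disease prevalence; the numbers are ours and are not drawn from any particular assay.

```latex
% Standard definitions, evaluated with assumed illustrative values:
% sensitivity Se = 0.95, specificity Sp = 0.99, prevalence p = 0.10
\[
\mathrm{PPV} = \frac{Se \cdot p}{Se \cdot p + (1 - Sp)(1 - p)}
             = \frac{0.95 \times 0.10}{0.95 \times 0.10 + 0.01 \times 0.90} \approx 0.91
\]
\[
\mathrm{NPV} = \frac{Sp \,(1 - p)}{(1 - Se)\, p + Sp \,(1 - p)}
             = \frac{0.99 \times 0.90}{0.05 \times 0.10 + 0.99 \times 0.90} \approx 0.99
\]
```

The same relationships show why improvements in analytical sensitivity and false-positive rates translate directly into better predictive value, and why that payoff grows as the mutation or pathogen being sought becomes rarer.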
For example, we have observed an increasing number of markers used to inform the decision of whether a colorectal cancer patient should receive EGFR antagonist drugs such as Erbitux (cetuximab) and Vectibix (panitumumab). While the label of Erbitux only goes so far as to say “trials have not shown a treatment benefit for…patients whose tumors had KRAS mutations in codon 12 or 13,” many other markers with negative predictive value have been identified and are routinely used to determine whether or not to use these drugs.
Indeed, market-leading clinical laboratories offer and routinely perform fairly comprehensive EGFR pathway analysis covering additional KRAS mutations as well as mutations in the genes BRAF, NRAS, PIK3CA, and others. These pathway analyses are often carried out as a series of one-off tests (rather than higher-plexity panels) and often accomplished using capillary electrophoretic sequencing (rather than methods with higher analytical sensitivity and quantification capability). Even so, we believe that as more markers are discovered and as the community becomes more interested in the relative abundance of important mutations, demand for the unique technical capabilities of high-throughput sequencing technology will be created in clinical contexts.
We currently estimate that at least 20% of this application area will migrate to high-throughput sequencing platforms by ~2014, with accelerating penetration thereafter. Furthermore, we see a multibillion-dollar opportunity for the technology and associated services tied to a genome-first cancer-care paradigm (beginning in areas such as triple-negative breast cancer), likely to become significant from 2014–2020.
We call the longest-term and potentially most significant trend in sequencing the omnipotence of sequencing technology. We define this trend as the gradual diffusion of high-throughput sequencing technology outside of the realm of genomics and into the analysis of other analytes including small molecules, proteins and peptides, and intracellular phenomena across research, clinical, and commercial applications.
This potential is driven by the fact that high-throughput sequencing technology has become the most simultaneously sensitive and massively parallel analytical instrumentation yet invented. While genomics applications, as a large market involving the analysis of relatively simple analytes, represent the low-hanging fruit for the technology, we observe and foresee a generation of pioneers in molecular biology building a steady stream of additional applications.
Given the size of the general analytical instrumentation market (about $40 billion) and that of the regulated, commercial in vitro diagnostics market (similarly at about $41 billion), the implications for existing market participants and customers are considerable. If a few racks of sequencing instrumentation can replace an entire clinical lab worth of clinical chemistry, immunoassays, microbiology, and molecular diagnostics equipment, industry heavyweights could ultimately be at risk.
Harry Glorikian ([email protected]) is managing partner and Brian Clancy is an associate with Scientia Advisors.