Mark Fischer-Colbrie, Chief Executive Officer, Labcyte

Reproducible Lab Results Yield More Drug Discoveries and Faster Time-to-Market

The true end goal of our collective work in healthcare-based life sciences is better outcomes for patients. At the heart of it, we need to significantly accelerate R&D cycle times and reduce discovery costs. Given the budgetary pressures on research spending, academia and biopharma alike are struggling to find new models to address these issues, including how to get great ideas through the “valley of death.”

To design a better approach to healthcare, we need to rethink basic elements that are holding us back. That’s true throughout the drug discovery and development process, and it starts at the earliest stages—in research labs, where some of the most important new findings occur. If we are ultimately to deliver cost-effective precision medicine, we must sharpen our tools and make lab practices more robust. These operations will eventually feed into Big Data and AI pipelines, but if labs continue to generate irreproducible data, those sophisticated pipelines will only churn out useless or, worse, misleading insights.

The generation of good data is limited by some instruments and reagents, as well as by a lack of training and standardization. Labs are working hard to do research with insufficient resources, but they often use practices that generate results that cannot be reproduced, are difficult to scale up, or are consistently wrong. Errors that would never be tolerated in other areas of science occur in NGS, DNA amplification, antimicrobial susceptibility, and drug dosing experiments, for example, and can result in missed drug discoveries, limited personalized medicine approaches, and incorrect diagnostics. As an industry, we must nail down our processes and improve the quality of data to produce the most reliable, clinically meaningful information.

Three Challenges

Labs face several obstacles to adopting a more industrial-scale mindset, but most of them fall into three categories.

Acceptance of the Status Quo
Time and budgetary pressures result in labs accepting shortcomings in a surprising number of procedures and technologies. Whether it’s relying on pipettes that dispense inconsistently or error-prone spreadsheets, expectations have been lowered or compromised based on the belief there is no better approach or that something is “good enough.” This view masks the underlying problems and forces us to overlook multiple sources of error that contribute to suboptimal experimental results. In a field that inherently relies on precision, why is this tolerated? Many tools and methods regularly used in labs are inadequate or have limitations, but without questioning them, we cannot truly understand the scale of the problems they cause in data quality and accuracy of results. In turn, the data generated can lead in the wrong direction.

Barriers to New Technologies and Lack of Standardization
As someone who has been on the commercial side of this industry for many years, I have learned it can be difficult to get new tools into labs to address the limitations of older platforms or to have labs adopt and train for standardization. Carving out enough time to investigate and evaluate new tools can be a significant undertaking that some feel they cannot afford. The problem is more difficult when inevitably some new technologies do not pan out.

Interrogating Complex Systems
While biological study traditionally took a more reductionist view, progress in drug discovery and diagnostics is requiring the embrace of more complexity for analyzing interactive biological systems. Ideally one must understand not only genetic changes for each disease type or patient, but also epigenetics, transcription factors, protein expression, and more. This approach requires expertise in multiple systems; inaccurate data generated from one of these analyses may have broad repercussions.

How It Can Work

Traditional liquid handling and sample preparation, whether done manually or with robots, are mainstays in drug discovery and other molecular biology labs, but are known for multiple sources of error. Chaotic dispensing, adherence of critical sample to the inside of a pipette, cross-contamination, robots that break down, and poor tip calibration are just some of the common errors.
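The way these transfer errors compound can be made concrete with a back-of-the-envelope simulation. The sketch below is purely illustrative—the 1% per-transfer bias and the coefficient-of-variation figure are assumptions for demonstration, not measured data from any instrument—but it shows how even a small, consistent under- or over-delivery per pipetting step snowballs across a routine serial dilution:

```python
import random

def serial_dilution(steps, transfer_cv, bias=0.0, seed=0):
    """Fractional deviation of the final concentration in a 2x serial
    dilution, given imperfect liquid transfers.

    transfer_cv: random transfer variation (coefficient of variation)
    bias:        systematic fractional over/under-delivery per transfer
    """
    rng = random.Random(seed)
    conc = 1.0   # actual concentration, arbitrary units
    ideal = 1.0  # concentration assuming perfect transfers
    for _ in range(steps):
        # Intended: transfer 1 volume into 1 volume of diluent (2x dilution).
        v = 1.0 * (1.0 + bias + rng.gauss(0.0, transfer_cv))
        conc = conc * v / (v + 1.0)  # actual dilution factor this step
        ideal /= 2.0
    return conc / ideal - 1.0        # deviation from the ideal result

# A hypothetical 1% systematic bias per transfer, compounded over a
# 10-step dilution, shifts the final concentration by roughly 5%.
err = serial_dilution(steps=10, transfer_cv=0.0, bias=0.01)
```

A dose-response curve fit to points that are each a few percent off in this correlated way can report a meaningfully wrong potency—one reason per-transfer accuracy matters far more than any single pipetting step suggests.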

At Labcyte, we took on the challenge of reimagining the liquid handling process, finding that sound waves could move fluids with unprecedented precision in billionth-of-a-liter increments, completely bypassing the myriad issues with pipettes. Without tips, nozzles, or valves, these acoustic droplet ejection (ADE)-based liquid handlers allow for touchless transfer between machine and samples, providing real control over liquids with precision and accuracy.

The benefits of retooling the liquid handling process extend beyond the researchers using the equipment to actual patients. One recent paper shows ADE’s improvement in the study of a potential therapeutic related to type 1 diabetes. Other studies report that combining genomic data with direct drug testing on an individual’s cancer cells via acoustic dispensing has led to patients entering remission, new drug candidates for leukemia and other cancers, and the unanticipated discovery that an existing renal cancer drug appears to be effective against a type of leukemia.

It is possible that much of the reproducibility crisis plaguing biological data right now stems from factors as simple as pipette tips. Think about that: how many billions of dollars have been put at risk or misdirected because of poor liquid handling? If we had never questioned the basic pipette tip, liquid handling may never have been appreciated as a serious source of error in biological data. Solving the problem has allowed scientists to generate much better data and make significant progress in understanding complex systems.

What’s Next?

In an era of tight budgets and lean resources, it may seem impractical to call for reallocating time and money to the development and evaluation of tools to address issues that are not immediately recognized as problems. But this is precisely the time for such action. Without investing in our future, we will be stuck with the same experimental limitations, the same questionable data, and ultimately the same imperfect healthcare system. Acknowledging the shortcomings in existing lab workflows and technologies is a critical first step that will spur innovation and the adoption of new tools—tools that will ultimately reduce costs so we can accomplish more science and improve the quality of healthcare faster.

Mark Fischer-Colbrie ([email protected]) is CEO of Labcyte.

Further Reading
1. Rees S, Gribbon P, et al. Towards a hit for every target. Nat Rev Drug Discov. 2016 Jan;15(1):1-2. doi: 10.1038/nrd.2015.19. Epub 2015 Nov 20.
2. Begley CG, Buchan A, Dirnagl U. Robust research: Institutions must do their part for reproducibility. Nature. 2015 Sep 3;525(7567):25-7. doi: 10.1038/525025a.
3. Naylor J, Rossi A, Hornigold DC. Acoustic dispensing preserves the potency of therapeutic peptides throughout the entire drug discovery workflow. J Lab Autom. 2015 May;21(1):90-96. doi: 10.1177/2211068215587915.
