Insight & Intelligence : Feb 25, 2014
Updating R&D for an Increasingly B2B World
Externalization is the new normal. Find out what you need to know to keep up.
Today’s research and development (R&D) environments have seen significant technological advances over the last ten years. These technologies generate ever more data and continue to challenge scientists to manage the deluge of information available. This “old but growing” challenge is compounded by other industry pressures, mostly cost-based, that have driven the growth of externalization. One effect is the creation of more B2B relationships in discovery research, upstream of the well-established preclinical and clinical areas. These upstream discovery activities are, by their very nature, far less “prescribed.” So how can organizations deliver results and share discovery research information effectively across the global networks that are evolving?
A More Productive Route to Drug Discovery
Scientists need to be able to draw conclusions and make decisions based on context-rich data quickly and confidently. After all, a new series of experiments is often designed and carried out based upon the results of the previous one. The body of scientific data called upon to make those decisions can come from anywhere in the world: from a peer-reviewed scientific paper, from a colleague or, increasingly, from a scientist who works in another part of the world for a different commercial organization.
R&D is based upon a highly iterative scientific method. It’s logical that reducing the time taken at each stage means getting more done: more decisions are made in any given time period. “Lean” initiatives spring to mind. The Japanese idea, which aims to reduce waste (or “muda”), was pioneered in automotive manufacturing and has since been adapted to industries such as pharma to boost efficiencies in the lab.
It’s not specifically about “good” or “bad” decisions/conclusions. In scientific terms, both are valuable. In commercial science settings, however, lab managers strive to reduce the latter; wasting time and effort on a problem that is not tractable doesn’t increase productivity. This logic scales well within a single, geographically co-located business function. It fares less well in globally distributed business functions that include a number of distinct and different organizations.
This scenario is at the heart of the future business model for pharma as the industry expands further upstream into discovery research. Externalization is based on the commercial need to reduce the overall cost base of drug discovery. As a service approach emerges, the cycle time for decision-making is again brought to the fore. That is where the three “-ilities” come in—key elements that organizations must perfect to meet this new normal.
The first of these is portability. In this B2B environment, success is measured by achieving the same results at a lower operating cost, so a service provider must be able to deliver its packaged output, in the form of scientific data and results, to the customer. Ideally, this package of information should encode metadata relating to the experimental method as well as details of the analysis algorithms or calculations used.
Today, automating this portability, and the acknowledgement of receipt, can trigger payment processes as well. In some ways portability can be as simple as an email attachment. Better still, extract, transform, and load (ETL) processes or similar mechanisms can automate this portability between data management systems across worldwide networks. So, what should be in our globetrotting information package?
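As a minimal sketch of such an automated handoff (the JSON exchange format, SHA-256 checksum, and all field names here are illustrative assumptions, not anything the article prescribes), a sender might serialize results together with their method metadata, and the receiving system might verify integrity before acknowledging receipt:

```python
import hashlib
import json

def package_results(results: dict, metadata: dict) -> tuple[str, str]:
    """Serialize results plus metadata and compute an integrity checksum.

    JSON and SHA-256 are illustrative choices; any exchange format and
    digest agreed between the two organizations would do.
    """
    payload = json.dumps({"metadata": metadata, "results": results},
                         sort_keys=True)
    checksum = hashlib.sha256(payload.encode("utf-8")).hexdigest()
    return payload, checksum

def receive_package(payload: str, checksum: str) -> dict:
    """Verify the checksum before loading; in a real system the
    acknowledgement could trigger downstream (e.g. payment) processes."""
    if hashlib.sha256(payload.encode("utf-8")).hexdigest() != checksum:
        raise ValueError("package corrupted in transit")
    return json.loads(payload)

# Hypothetical result set travelling from service provider to customer.
payload, digest = package_results(
    results={"IC50_nM": 120.0},
    metadata={"assay": "kinase inhibition", "fit_model": "4PL"},
)
incoming = receive_package(payload, digest)
```

Carrying the method metadata alongside the raw numbers, rather than in a separate email, is what lets the receiving system incorporate the result without manual transcription.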
Linked very closely to portability is “incorporability” (which is not currently listed in the English dictionary, but it should be). Definition: the ability to incorporate data.
What use is the fast delivery of a detailed package of information if it can’t be immediately incorporated into the pre-existing body of data and information of the customer? Decision-makers must be able to interpret new information quickly, in the context of what they already know, to ensure productivity and appropriate cycle-time. These packages of information must be in a format that lends itself to immediate incorporation.
Systems and formats that enable this rapid incorporation exist today, yet manual transcription and data entry, which add “muda” to what should be a streamlined process, still remain in the mix. This approach also introduces the risk of transcription errors. So, in the same way that plants have evolved efficient mechanisms to convert sunlight into useable energy, R&D must advance its systems and processes so that scientific data can be illuminated and made actionable.
The final “ility” is comparability, and it is critical because scientists rely on the ability to compare results from different organizations. Issues surrounding reproducibility have reached unprecedented heights, as a significant rise in retractions of peer-reviewed scientific papers takes hold (New York Times, April 16, 2012). In addition, a number of industry studies and reports have cited an unacceptably low reproducibility rate in scientific publications (Nature Chemical Biology 9, 345, 2013). It’s no wonder that www.reproducibilityinitiative.org now exists to validate the findings of scientific publications.
At a more granular level, it is easy to see why results are difficult to reproduce. In the lead optimization programs typical of the industry, results are often compared using a single numeric value, or at most a few, from individual biological assays. For example, an IC50 (the concentration of a test compound that exhibits 50% inhibition of the biological process being measured) is calculated using a four-parameter logistic fit model. Whilst the IC50 is the key measure, it should be taken in context with the other three parameters calculated (slope, max asymptote, min asymptote). Then add the choice between more than one fit model, and allow for human intervention: the scientist may remove a number or two from the original dataset in order to improve the quality of the fit. Even these seemingly minor variables in data analysis produce differences in results that may or may not affect subsequent decisions.
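A minimal sketch of that calculation, assuming the common four-parameter logistic form and SciPy’s `curve_fit` (the article names the model but not an implementation; the concentrations and parameter values below are synthetic, not real assay data):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, log_ic50, hill):
    """Four-parameter logistic: min asymptote, max asymptote, IC50, slope.

    The IC50 is fitted on a log10 scale, a standard trick that keeps the
    parameter positive and the optimization well conditioned.
    """
    return bottom + (top - bottom) / (1.0 + (conc / 10.0 ** log_ic50) ** hill)

# Synthetic, noiseless dose-response data: bottom=2, top=98,
# IC50 = 1e-7 M, slope = 1.1 (illustrative values only).
conc = np.logspace(-9, -5, 10)
resp = four_pl(conc, 2.0, 98.0, -7.0, 1.1)

# Fit all four parameters from a deliberately offset initial guess.
popt, _ = curve_fit(four_pl, conc, resp, p0=(0.0, 100.0, -6.0, 1.0))
bottom_f, top_f, log_ic50_f, hill_f = popt
ic50_fit = 10.0 ** log_ic50_f
```

Note that all four fitted parameters come back together; reporting only `ic50_fit` discards exactly the context (slope and asymptotes) the article argues should travel with it, and excluding a data point from `conc`/`resp` before fitting would shift the answer.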
Publishing (internally or externally) these four parameters on their own, without important details such as the Michaelis constant or the concentration of substrate or ligand used whilst performing the assay, negates the value and comparability of the results. The specifics of every calculation should be encoded within a portable and incorporable entity that can be passed between collaborators or organizations. Better still, the full scientific method should be encapsulated within this package of information.
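To make this concrete, one might gate publication of a result package on the presence of that assay context. A sketch under stated assumptions: the field names and the required-context list below are illustrative, not a published standard, and the exact list would be agreed between collaborating organizations:

```python
# Illustrative required context for an enzymatic IC50 result; real
# collaborations would agree their own list.
REQUIRED_CONTEXT = {"substrate_conc_uM", "km_uM", "fit_model", "excluded_points"}

def publishable(package: dict) -> bool:
    """An IC50 plus its three fit parameters is only comparable when the
    assay context travels with it; reject packages missing that context."""
    missing = REQUIRED_CONTEXT - package.get("context", {}).keys()
    if missing:
        raise ValueError(f"result not comparable; missing context: {sorted(missing)}")
    return True

result = {
    "ic50_nM": 120.0,
    "slope": 1.1,
    "max_asymptote": 98.0,
    "min_asymptote": 2.0,
    "context": {
        "substrate_conc_uM": 10.0,
        "km_uM": 8.5,
        "fit_model": "four-parameter logistic",
        "excluded_points": [3],  # any data removed by the scientist, recorded
    },
}
```

Recording `excluded_points` explicitly is what lets a later reader see any human intervention in the fit rather than having to guess at it.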
The Right Results
By combining these three “-ilities,” organizations can expect actionable scientific results, wherever they originate, delivered to the right place in the right format. Data will also be comparable with information and results from elsewhere. Downstream, the benefits are faster, better decisions and greater confidence in the data. Should a scientist ever need to come back to the results, five days or five years later, the entire method of calculation is stored with them, showing any data manipulation performed during the workflow. This adds value and maintains the highest possible standards in scientific data interpretation.
© 2013 Genetic Engineering & Biotechnology News, All Rights Reserved