Documents are containers of data, constrained by their creators to a particular context and interpretation. Previous research by IDBS has shown that approximately 25% of a researcher’s time is spent writing such reports—compressing information for communication.
In a linear process where the requirements for the consumption of data are well understood and unchanging, this may be sensible. In a complex environment where different consumers require different data, however, it is not.
The research shows that 60% of researchers have to wade through multiple documents to extract the data they need to start their work. Even then, researchers often have to spend time in unstructured Q&A sessions with the groups who generated the underlying data in order to challenge it and obtain what they need.
Thomas Goetz, executive editor of Wired and author of The Decision Tree: Taking Control of Your Health in the New Era of Personalized Medicine, reminded us that “We can profoundly change our behavior once we are provided with the relevant data.” Our survey showed that 91% of researchers could not align data from internal or external collaborators effectively.
Evolutionary history is punctuated by transformational events that appear to change the course of development. These extinction events are driven by environmental pressures and the inability of much of an ecosystem to “fit” the new reality.
Environmental pressures affect any ecosystem, and today’s environment, particularly in pharmaceutical sciences, is highly challenging. Andrew Witty, CEO of GlaxoSmithKline, put it this way: “The blockbuster business model clearly worked—and up until the time of the human genome breakthroughs, most would have expected this trend to continue. It has not. So we are having to reinvent our industry.”
R&D must become more productive and innovative—in fact more innovative than ever—to reverse the current trend of cost-per-marketed pharmaceutical, now estimated at almost $4 billion. Leaving aside the undoubted physical challenges of patent cliffs and regulatory fences, the productivity gap has driven the externalization of R&D.
Chris Viehbacher, Sanofi’s CEO, said the firm has realized that “major groups are not great sources of innovation.” Externalization allows companies to tap into smaller, flexible organizations and global talent but has the effect of making the extended organization significantly more complex.
The ecosystem, though, remains very similar: groups consume data, and groups produce data. The complexity only becomes noticeable when the data is not available to the consumer.
Each data generator should be able to seamlessly add to this landscape of data, and each consumer should be able to draw from it in the way they need, personalizing their data feed so that it remains consumable and relevant. “Death Star” warehouses—which must be built with advance knowledge of all the questions that will ever be asked of them—are the dinosaurs of our modern, personalized IT age. Is the cloud the new ecosystem? It will play a vital role in providing extensible computing, but it requires structure, application, and security.
There remains a lack of understanding about the various ways in which the cloud environment affects existing systems. An increasingly common survival tactic in this environment is “cloud-claiming”: simply providing a hosted version of an existing client/server application and branding it as a cloud-based or SaaS solution. This hosting-equals-cloud approach is not SaaS; it mainly leverages Infrastructure-as-a-Service (IaaS).
As Seymour Dunker, CEO of iCharts, says, “Everyone acknowledges that data is exploding, but no one seems to have a handle on finding relevant data and making meaning from it. The ecosystem of data is still in a very infant stage. The big gap is a data publishing and distribution platform that makes it simple to take the data from the source to where it can be utilized most effectively.”