By James Pena
Life sciences and healthcare industries are approaching digital maturity relatively slowly. According to a Deloitte study issued in 2018, just 13% of the companies in these industries ranked among “higher maturity” organizations. (For technology/media/telecommunications, financial services, and energy, higher maturity was achieved by 34%, 30%, and 27% of companies, respectively.)
One key technology helping laboratories to achieve digital maturity is the cloud environment. With modern cloud-based technologies, data storage, access, and analysis are more streamlined and secure, and data management is more flexible and scalable.
So, why do many life sciences and healthcare organizations remain digital laggards? First, knowledge of the cloud can be patchy and vague. Also, marketing jargon and technical language can obscure what the cloud can actually do. Finally, the ever-increasing complexity of scientific data can overwhelm organizations struggling with data management decisions.
Major values of the cloud
There are aspects of modern data management that are available only via the cloud environment. For example, cloud systems enable users to collaborate on huge datasets, including genome sequence datasets, in real time. Because the cloud provides a third-party space, users can pick and choose which data to share, keeping more sensitive data secure and private.
Another benefit is that users can aggregate diverse data sources. With cloud systems, repositories can be accessed by scientists around the globe or across different sites. For example, scientists can access data from different departments or organizations as a drug moves through the development pipeline and into manufacturing.
Cloud services also provide increasing levels of automation. Automation can take many forms, all of which let users harness the power of modern computing. For example, cloud microservices can automate the selection and storage of DNA sequence fragments, while other microservices can analyze those sequences and automate data generation. These capabilities relieve users of manual tasks so that they can focus on other activities.
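As a hypothetical illustration of the kind of task such a microservice might perform (the function names, length threshold, and workflow here are illustrative assumptions, not drawn from any specific vendor's service), a sequence-analysis step might filter uploaded DNA fragments by length and report a simple metric such as GC content:

```python
# Hypothetical sketch of a sequence-analysis microservice task:
# filter uploaded DNA fragments by length, then summarize each one.
# Names and the 50-base threshold are illustrative, not vendor-specific.

def gc_content(seq: str) -> float:
    """Fraction of G/C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def analyze_fragments(fragments: list[str], min_length: int = 50) -> list[dict]:
    """Keep fragments at or above min_length and summarize each kept one."""
    results = []
    for frag in fragments:
        if len(frag) >= min_length:
            results.append({
                "length": len(frag),
                "gc": round(gc_content(frag), 3),
            })
    return results
```

In a real deployment, a function like this would typically run behind a storage trigger or HTTP endpoint supplied by the cloud platform, so the analysis happens automatically as new sequence data arrives, without anyone running scripts by hand.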
The benefits of cloud environments extend beyond the practical details of data management. Benefits that may be realized by the organization as a whole can be summarized in three core concepts:
Reduced IT costs: With older, on-premises storage solutions, hardware and software must be bought and managed onsite, which is expensive and onerous. Managing and monitoring newly upgraded facilities and acquiring the necessary IT expertise can be overwhelming, especially for organizations already struggling to keep pace with technological change. With cloud services, much of the hardware (how much depends on the customer) is managed off site by the cloud provider, meaning the customer can leverage up-to-date technologies without needing to buy or manage them.
Scalability: As a laboratory or a business grows, it may soon become apparent that hardware such as data servers cannot keep up with increasing data demands. With cloud services, however, capacity can scale with the size of the business and adapt to the new types of science the laboratory does over time. Computational power, too, can grow as the laboratory’s scope expands.
Flexibility: The cloud is highly flexible and adaptable to the needs of laboratories and businesses. Users can, for example, connect on-premises systems to the cloud, connect with collaborators in different countries, or access services from remote locations while in the field. Depending on where an organization is in its digital journey, it can choose to access any number of digital services to suit its specific needs.
Historically, on-premises cloud infrastructures were built from the ground up as standard intranet infrastructure requiring in-house IT expertise. But over the last 20 years, public cloud services (through providers such as Google Cloud) have proliferated; these larger solutions connect different customers to the same central data storage system to increase efficiencies and drive down costs. However, public cloud systems don’t work for all businesses, especially those subject to strict data security and privacy requirements, such as when dealing with intellectual property. Public cloud providers therefore created the virtual private cloud, which logically and/or physically segregates data storage hardware dedicated to individual customers. This means that customers can have the security of private cloud storage nested within the accessibility of a public cloud.
Cloud service types
Different cloud service types accommodate different levels of access and control. The “bare metal” service type uses only on-premises hardware. The infrastructure as a service (IaaS) type uses bundled hardware configurations, which cloud service providers define based on customer needs. The platform as a service (PaaS) type adds pre-loaded operating systems or applications. Finally, the software as a service (SaaS) type offers the most flexibility and scalability.
There are two SaaS subtypes: managed SaaS and “true” SaaS. With managed SaaS, the provider offers additional support by taking care of networking and hardware requirements, but the cloud itself is overseen by the customer. This gives the customer greater control over configurations, security, and data encryption, making managed SaaS ideal for customers who must adhere to tight data regulations, such as those in GxP environments. “True” SaaS systems, by contrast, are typically multi-tenant cloud systems.
Which cloud provider to choose?
One general consideration is whether the provider has expertise with the desired service type or subtype. For example, if the managed SaaS subtype is desired, the provider should be able to assume deployment and upgrade responsibilities on behalf of the customer. This arrangement enables fast deployment across a business and full maintenance support. The customer decides which upgrade and software management tasks are needed, but the tasks themselves are performed by the provider.
Another general consideration is the availability of helpful customer service teams. Such teams may be especially important for global deployments and other complex implementations.
More specific considerations correspond to unique laboratory or business needs. Such considerations reflect three core concepts:
Foundation (control and compliance): When it comes to data security, control and compliance are often subject to legal, regulatory, or institutional requirements. These requirements may impact your cloud deployment and service type possibilities.
Interaction (access and capacity): It’s also crucial to think about the types of access points (on site or remote), the number of access points, and how dynamic they need to be. For capacity, you need to consider speed versus cost. Do you need instant access to data to perform highly critical decision making? Do you intend to use microservices that require great computational power to analyze complex datasets? Every option has an associated premium.
Futureproofing (long-term strategy and business goals): It’s also good to prepare a cloud system that can keep up with laboratory/business growth. For example, having cloud infrastructures ready to deal with mergers or collaborations ahead of time can make these transitions seamless.
The cloud is more than just data storage
Cloud technologies are changing the way science is being performed. With the cloud, scientists can push innovation and accelerate their research through a huge range of different services and systems. Modern cloud providers are tailoring their services to the exact needs of customers and removing barriers to the desired levels of data security and access. Moreover, cloud technologies are cost efficient, and they can adapt to the growth and needs of the laboratory/business. Ultimately, adopting cloud-based data management systems can empower organizations and give them the tools they need for the future of data-led science.
James Pena is product manager, digital science, Thermo Fisher Scientific.