September 15, 2012 (Vol. 32, No. 16)
The quest for personalized medicine adds to the complexity of today’s scientific questions. Drug discovery efforts require access to larger arrays of biosamples, and demand is exceeding supply, fueling the growth of the biobanking market.
“Humans are inherently variable, which is good for evolution but bad for drug discovery. More samples equal more reliable results. However, cost and ease of access tend to limit the number of samples used,” explained Paul Whittaker, Ph.D., director and unit head for preclinical biomarkers, respiratory disease area, Novartis.
Experts met at SMi Group’s recent biobanking conference to discuss the challenges the industry faces, from supply issues, standards development, and sample tracking across multiple organizations to potential uses for banked clinical trial samples.
Sources for human material are increasing, ranging from those that provide postmortem material, such as the International Institute for the Advancement of Medicine (IIAM), to consortia that provide samples from living donors with specific diseases, such as the Lung Tissue Research Consortium. Depending on the source, varying amounts of donor information are supplied with the human materials.
The Medical Research Council and Association of the British Pharmaceutical Industry COPD consortium aims to identify more homogeneous groups of patients using extensive and standardized clinical and physiological testing.
“The term ‘well-phenotyped’ could mean an exhaustive list of clinical, physiological, and molecular parameters that would maximize use of human tissue for a range of experimental uses, or it could be a more restrictive set of parameters that are highly relevant to the particular hypothesis being tested,” continued Dr. Whittaker.
“For some lung diseases, like interstitial pulmonary fibrosis and chronic obstructive pulmonary disease, there is good access to high-quality, well-annotated material, and for others, such as pulmonary arterial hypertension, academic collaborations are the main way to access material.
“It is clear that no one biobank can meet all the needs for biomedical research and that biobank networks will be necessary to make collections more widely available.”
An exciting development is the emergence of virtual biobanking, where companies provide a single point of access to a range of biospecimens. Using networks of ethical sources, they locate tissues that meet the requester’s specifications and then deliver them. This streamlines tissue procurement and benefits all parties.
Maximizing Biospecimen Use
Biospecimens are most frequently used in drug discovery research. To answer questions about drug mechanisms in the patient, biospecimens obtained during clinical trials are invaluable.
Clinical trial protocols specifically state how collected samples will be used and how long they will be retained. Depending on consent parameters, biospecimens collected during a clinical trial are often discarded after the study report is finalized.
“Most companies are sampling more, increasing and building biorepositories,” commented Mads Roepke, Ph.D., senior scientific advisor, clinical pharmacology, LEO Pharma. “The discussion is what to use the samples for. Some companies impose strict internal barriers for use of samples, and bank samples only to use if safety or efficacy issues arise. Others are more collaborative across divisions, and use them for drug discovery and development in additional cross studies.
“Use of samples across studies requires special approval by the regulatory committees and consent from participants. When you are outside the parameters of the initial study protocol and wish to store patient samples for later studies with a new scope, you move into biobanking and the associated regulations, including questions of sample analysis and reporting.
“At LEO Pharma, we are currently looking at using biospecimens obtained from patients in one program for investigations in other programs where drug candidates with similar mechanisms of action and/or the same patient types are studied. Clinical data associated with the sample would be more specific to the disease. By maximizing the use of patient samples, we could increase our understanding of disease biology, and advance research on future therapeutics for the benefit of patients.”
Implementing End-to-End Processes
“Good quality assurance in biobanking is an end-to-end process,” remarked George Tokiwa, Ph.D., associate principal scientist, Merck Research Laboratories. “In a pharmaceutical company, you need to be sure that you plan, collect, process, track, and store future-use specimens in a manner that maximizes specimen utility, and minimizes risk to specimen integrity.”
Quality-management processes, documentation, and the physical attributes of the specimen are all crucial. Specimen collection requires compliant, robust methodology and validated processes that safeguard specimen integrity. Good IT infrastructure, a robust and comprehensive inventory system, specimen oversight/governance, and project management ensure chain of custody.
Merck utilizes processes to track specimen permissions, consent reconciliation, and other possible restrictions or attributes of the specimens, e.g., restricted to disease-specific use. Consent expiration dates are tracked, and specimens flagged and destroyed as required. In addition, since certain specimens or specimen analytes may be stable only for a finite period of time, these specimens are tracked and flagged as suspect when appropriate.
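A consent- and stability-tracking step of this kind can be sketched in a few lines of Python. This is a minimal illustration only; the field names, dates, and flagging rules below are hypothetical and do not represent Merck’s actual system:

```python
from datetime import date

def flag_specimens(specimens, today):
    """Flag specimens whose consent has expired (destroy) or whose
    analytes may have passed their stability window (mark suspect).
    The rules and field names here are illustrative."""
    actions = []
    for s in specimens:
        if s["consent_expires"] < today:
            actions.append((s["id"], "destroy: consent expired"))
        elif s["stable_until"] < today:
            actions.append((s["id"], "flag suspect: past stability window"))
    return actions

# Hypothetical inventory records
inventory = [
    {"id": "SP-001", "consent_expires": date(2011, 6, 30), "stable_until": date(2014, 1, 1)},
    {"id": "SP-002", "consent_expires": date(2015, 1, 1), "stable_until": date(2012, 3, 1)},
    {"id": "SP-003", "consent_expires": date(2015, 1, 1), "stable_until": date(2015, 1, 1)},
]
print(flag_specimens(inventory, today=date(2012, 9, 15)))
# -> [('SP-001', 'destroy: consent expired'), ('SP-002', 'flag suspect: past stability window')]
```

In practice such rules would live inside the inventory system and run automatically, but the decision logic reduces to simple date comparisons of this form.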
During each clinical study, performance metrics are used for trend analysis to determine whether intervention or remediation is necessary. One example is monitoring future-use genetic specimen collection rates and DNA quality. Each study site’s performance is monitored and compared to expected collection rates, DNA yields, and quality.
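Site-level monitoring of this kind amounts to comparing observed metrics against expected thresholds. The sketch below shows the idea; the threshold values and data structure are invented for illustration, not taken from any actual study:

```python
def monitor_sites(site_stats, expected_rate=0.9, min_median_yield_ug=2.0):
    """Flag study sites whose future-use specimen collection rate or
    median DNA yield falls below target (thresholds are illustrative)."""
    alerts = []
    for site, stats in sorted(site_stats.items()):
        rate = stats["collected"] / stats["enrolled"]
        if rate < expected_rate:
            alerts.append((site, "collection rate below target"))
        if stats["median_yield_ug"] < min_median_yield_ug:
            alerts.append((site, "median DNA yield below target"))
    return alerts

# Hypothetical per-site statistics
stats = {
    "Site 101": {"enrolled": 40, "collected": 38, "median_yield_ug": 3.1},
    "Site 102": {"enrolled": 40, "collected": 25, "median_yield_ug": 1.4},
}
print(monitor_sites(stats))
# -> [('Site 102', 'collection rate below target'), ('Site 102', 'median DNA yield below target')]
```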
“Biobanking specimens for long-term storage and future use can be an expensive proposition. We want to ensure maintenance of material that is of value,” continued Dr. Tokiwa. “We developed a specimen retention algorithm, a set of decision trees to assess the biorepositories’ inventories and determine how best to disposition the specimens. With appropriate stakeholders’ feedback we are able to objectively decide if specimens should be retained or discarded.”
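A retention decision tree of the kind Dr. Tokiwa describes can be expressed as a short chain of conditions. The branch criteria below are hypothetical stand-ins for stakeholder-agreed rules, not the actual Merck algorithm:

```python
def disposition(specimen):
    """Toy retain-or-discard decision tree; every criterion here is
    illustrative, standing in for stakeholder-agreed rules."""
    if not specimen["consent_valid"]:
        return "discard"  # cannot legally retain without valid consent
    if specimen["active_program"] and specimen["analyte_stable"]:
        return "retain"   # still useful to an ongoing program
    if specimen["rare_indication"]:
        return "retain"   # scarce material may justify the storage cost
    return "discard"      # storage cost outweighs likely future value

print(disposition({"consent_valid": True, "active_program": False,
                   "analyte_stable": True, "rare_indication": True}))
# -> retain
```

Encoding the rules explicitly is what makes the retain/discard decision objective and repeatable rather than ad hoc.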
Merck Research Laboratories recently signed a partnership agreement with the National Center for Cardiovascular Diseases Fuwai Hospital for collaboration on the development of China’s first biobank for cardiovascular and metabolic biosamples.
Tracking Samples in Virtual Organizations
“Often precompetitive research, focused on disease understanding, uses a consortium, a virtual organization,” explained Anthony Rowe, Ph.D., principal research scientist, external innovation R&D IT, Janssen. “Data-collection and results-management technologies must be diffused across all of the organizations.”
There are currently several hundred ongoing collaborative studies worldwide, and little existing technology can be deployed across multiple institutions to track samples. Classical LIMS-based biobanking software is typically tied to a single enterprise and does not support consortium-based research.
The quality management of a collaborative scientific research environment is challenging enough without ineffective sample-tracking methods adding to the burden. Different institutions tackle different parts of the scientific problem, so samples can be spread across many locations. Without global visibility across the entire consortium, identifying quality problems and bottlenecks is difficult.
“Biosamples are critical elements of precompetitive research,” said Dr. Rowe. “Sample management should be viewed much like supply chain management. Consortia need infrastructure to manage samples across the virtual enterprise, just as they need a research database to collate all of their scientific results. LIMS work fantastically well within a single enterprise, but you want to provide sample tracking across multiple enterprises.
“If there are systematic problems, you need to identify and rectify them, and monitor samples as they go through the study network. Up-front sample issues affect downstream data analysis. The entire project can be jeopardized by small differences in how samples are handled and processed. We use barcodes to track sample movement along with a paper trail. This gives us the capability to track even if the samples are sent to a third group,” he continued.
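The barcode-based chain of custody Dr. Rowe describes boils down to an append-only event log keyed by sample barcode. The sketch below shows the core idea; real consortium tracking systems carry far more metadata, and all names here are invented:

```python
class SampleTracker:
    """Minimal chain-of-custody log keyed by barcode (illustrative;
    a production system would add timestamps, signatures, and state)."""

    def __init__(self):
        self.events = {}  # barcode -> list of (site, action) events

    def record(self, barcode, site, action):
        """Append a handling event for a barcoded sample."""
        self.events.setdefault(barcode, []).append((site, action))

    def custody_chain(self, barcode):
        """Return the ordered list of sites that handled the sample."""
        return [site for site, _ in self.events.get(barcode, [])]

t = SampleTracker()
t.record("BC-1001", "Hospital A", "collected")
t.record("BC-1001", "Lab B", "DNA extracted")
t.record("BC-1001", "Core C", "genotyped")   # even a third-party site is visible
print(t.custody_chain("BC-1001"))
# -> ['Hospital A', 'Lab B', 'Core C']
```

Because every handoff is logged against the same barcode, the consortium retains visibility even when a sample moves to a third organization.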
“tranSMART, an open-source translational data warehouse developed as a Janssen in-house solution, is often deployed as the results data warehouse. We realized early on that if we made tranSMART open source, we could enable better-quality science. There would be a single research database. Funding would go to the science, and not IT,” concluded Dr. Rowe.
Developing Biobank Networks
In Europe, the BBMRI (Biobanking and Biomolecular Resources Research Infrastructure) was funded as a preparatory project to develop plans for a pan-European network of biobanks.
“A network of biobanks is just one component of a biobanking network. Future research needs require distinct networks to work together. A biobanking network is a network of networks,” explained Martin Yuille, Ph.D., joint director, Centre for Integrated Genomic Medical Research, University of Manchester.
“These networks include experts in managing all the varied data types—clinical, environmental, experimental, and so on. Then there are groups of public and private funders, technology providers, and governance experts. All this expertise is essential to a biobanking network.
“It is complicated. Existing medical information classification systems, such as HL7 and SNOMED, are useful and allow flexibility, but not everyone follows them. In addition, samples are processed in varied ways using different instrumentation. Standards have to be consensual between all stakeholders, and those stakeholders need to include instrument manufacturers.
“Networking processes—sharing standards and allowing each network to focus on its strengths, and coordinating all this—will produce the biobanking network needed for future competitive research and cost efficiency.”
An example is the University of Manchester hub-and-spoke pilot project. Accrual and initial sample stabilization take place in the spokes, such as hospitals, while final processing, storage, and retrieval are undertaken centrally in the hub. This organizational setup provides the opportunity for large-scale sample management and allows division of labor and role differentiation. Each stakeholder does what they do best.
“Running a biobank requires high-quality, consistent sample management, cost effectiveness, and associated data management. Running a biobanking network requires a new combination of skills—lab methods research, informatics, and organizational research. There is no Big Bang solution to move from the current fragmented landscape toward a coordinated one.
“Change management is essential to remove barriers. The biomedical research community needs to reach out to experts in the humanities for help, to political scientists, sociologists, and economists, and we need to engage the public in research around their health,” concluded Dr. Yuille.
Creating a Successful Biobanking Operation
Last month, Tim Sheehy, director, global genomics, Promega, presented a webinar on “Biobanking in 2012: Careful Selection of the Proper Tools to Increase Productivity.”
Sheehy pointed out that every biobank has a different set of customers and downstream applications for which it must supply samples. He added that there are always multiple solutions to address needs around productivity.
“A major challenge to making any kind of improvement is to change the mindset that biobanks are just freezers, refrigerators, and file cabinets into one that highlights the translational importance of the samples being managed,” he said. “Continuing to strive for quality samples is key to both productivity and building lab services.”
Sheehy outlined a few first steps to consider to strengthen a biobank operation. First, carefully examine the entire existing process. To improve productivity, one needs to really know the existing process from cradle to grave. Specifically, determine how samples are:
- Delivered (formatted)
- Processed (workflow)
- Requested by the end user (format and frequency)
“A thoughtful review will help you prioritize what can be done based on ease and cost,” he explained.
Second, examine quality standards further upstream, at sample handling and prequalification. Ensuring quality samples is challenging at best if few or no standards are set for sample collection and submission. Consider encouraging a Six Sigma approach to prequalification of samples as part of the sample-handling workflow.
“Such practices can reduce costs, centralize controls for assuring quality and improving process metrics, and manage expectations of real-time sample availability and fit-for-intended-use annotations,” said Sheehy.
Take a Stepped Approach
Carefully examining existing processes and setting expectations for how improvements will impact your lab are critical. Then some initial steps for improving productivity should be explored:
- Incorporate prequalification assays, focused on assessing DNA quality and fitness for intended use, into upstream process or workflow tasks.
- Centralize extraction, quantitation, and prequalification assays into a single workflow that allows true process control within a standalone sample-handling laboratory.
- Provide platform-specific, run-ready plates for all downstream applications regardless of the processed sample type (DNA, RNA, or protein). This enables the biobank to optimize the effective use of the samples it holds while reducing or eliminating additional manipulation of a sample by the more expensive downstream analysis laboratories, e.g., genotyping or sequencing core labs.
- Automate the process steps that deliver the largest return on investment. This frees lab staff to learn the tasks that are more difficult to automate, which could yield additional service offerings for the biobank’s customers without an increase in head count. Tie the samples’ data flow through the workflow to provide real-time sample annotation, better control over the various quality checks, and the ability to report operational and financial metrics on the realized improvements.
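A prequalification assay of the kind recommended above reduces to a fit-for-intended-use gate on a few measured values. The sketch below uses DNA yield and the common A260/A280 purity ratio; the specific thresholds are illustrative assumptions, not a published standard:

```python
def prequalify_dna(sample, min_yield_ng=500.0, purity_range=(1.7, 2.0)):
    """Hypothetical fit-for-intended-use gate on extracted DNA.
    Thresholds (minimum yield, A260/A280 purity window) are
    illustrative; real labs set them per downstream application."""
    if sample["yield_ng"] < min_yield_ng:
        return "fail: insufficient yield"
    lo, hi = purity_range
    if not (lo <= sample["a260_a280"] <= hi):
        return "fail: purity out of range"
    return "pass"

print(prequalify_dna({"yield_ng": 820.0, "a260_a280": 1.85}))  # -> pass
print(prequalify_dna({"yield_ng": 120.0, "a260_a280": 1.85}))  # -> fail: insufficient yield
```

Running such a gate upstream, before samples reach genotyping or sequencing core labs, is what lets the biobank guarantee that plated samples are already fit for the intended downstream use.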