Feature Articles : Sep 15, 2013
Designing and Managing Modern Biobanks
The last decade has witnessed the rapid growth and expansion of biobanking. In particular, researchers in genomics and personalized medicine are increasingly reliant on well-designed biobanks since annotated samples from patients with known histories are critical for correlating the biology of a disease with clinical data.
GEN recently interviewed several experts who will present at CHI’s “Leaders in Biobanking” conference in November. The focus of the meeting will be on optimal methods to design and manage modern biobanks.
According to Andrew M. Dahlem, Ph.D., vice president, Lilly Research Lab Operations Europe, Eli Lilly, the company uses two fundamental biobanking strategies: nondirected and directed screening.
“In nondirected strategies we collect and bank samples and collect information for future analysis by scientists who screen the collection for predetermined subgroups of people,” explains Dr. Dahlem. “This approach gives individual researchers access to genetic and biological data that may be more valuable as learning curves grow and specific questions are asked from a future perspective.”
The second method is directed screening. “This represents the most promising application of biobanking because it refers to collecting samples from patients with known diseases and requires linking knowledge of nonconfidential information in patient records to targeted hypothesis testing.”
Dr. Dahlem believes the key to realizing goals with either strategy is in banking the right set of samples and linking them to what’s been determined as important patient information.
“While there are many factors to consider when deciding the best approach to a company’s biobanking strategy, some points to consider include the critical nature of the sample and the immediacy of analysis, the capital cost of purchasing and maintaining freezers or other storage devices, information technology needs, ease of retrieval, and whether sample storage is viewed as a laboratory’s core capability.”
“Biobanking is rapidly becoming an integral aspect of drug development and precision or personalized medicine,” maintains Dr. Dahlem. “Directed biobanking coupled with good laboratory science is essential to translating the observed pathophysiology of patients into new pharmaceutical agents.”
“One example could be a specific protein with an altered concentration relative to a targeted disease state and a normal population,” he explains. “The observed differences between protein concentrations may help us design interventions to restore the patient to normal levels. A high-quality biobank is essential in order to accomplish this type of analysis.”
Many institutions are currently establishing precision medicine programs that will systematically collect tissue samples from patients with metastatic cancers that have failed several lines of therapy. These samples will undergo genomic analysis using massively parallel sequencing technologies with the aim of identifying “driver mutations” which could then be targeted by FDA-approved drugs or by drugs currently in clinical trials.
“Developing biobanks with the highest possible standards will be essential to precision medicine efforts,” notes Dr. Dahlem.
Timothy J. Geddes, who manages the Erb Family Core Molecular Laboratory, Beaumont Biobank, William Beaumont Hospital, will highlight the importance of optimizing and customizing methods to maximize the yield of each specimen.
“Every specimen contains valuable information, no matter where it came from, how old it may be, or how it was processed and preserved. Its wealth lies in determining the optimal way to garner that information,” says Geddes, who also stresses the critical need for quality control regarding specimen-handling conditions in the biobanking process.
“Quality control measures and metrics to assess biospecimen utility are the lifeblood of a successful biobanking endeavor. Even though every institution has sample processing and storage procedures that they believe are ‘the best,’ there need to be ways to validate these claims and unify sample quality standards through downstream assessment, which complements upstream operations.
“As the sharing of samples between institutions becomes more of a reality, it is vital that the integrity of specimens used for biomarker studies, for example, is held to the highest possible standards to ensure the validity of the data generated and the confidence in the application of the resulting new findings to clinical management,” explains Geddes.
He goes on to describe two promising QC metrics designed to keep up with emerging proteomic and genomic analysis technologies.
“The size metric analysis RNA threshold (SMART), which is used for determining the usability of highly degraded RNA, such as that which is isolated from laser capture microdissected formalin-fixed paraffin-embedded specimens, is one,” he says. “SMART is designed to replace the RNA integrity number (RIN) when dealing with highly fragmented RNA in determining its utility in downstream applications, such as expression analysis by gene chip technology.”
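The article does not spell out the SMART formula, but the idea of judging degraded RNA by fragment size can be sketched as a simple size-threshold calculation on electropherogram-style data. The threshold value, function name, and numbers below are illustrative assumptions, not the published SMART method.

```python
# Hedged sketch of a size-threshold RNA usability check in the spirit of SMART.
# The actual SMART threshold and formula are not reproduced here; the 200 nt
# cutoff and all data values are illustrative assumptions.

def fraction_above_threshold(fragment_sizes_nt, intensities, threshold_nt=200):
    """Share of total signal coming from fragments at or above a size cutoff."""
    total = sum(intensities)
    if total == 0:
        return 0.0
    above = sum(i for s, i in zip(fragment_sizes_nt, intensities)
                if s >= threshold_nt)
    return above / total

# Electropherogram-style data: fragment size (nt) vs. signal intensity.
sizes = [50, 100, 150, 200, 300, 500]
signal = [10.0, 20.0, 30.0, 25.0, 10.0, 5.0]
print(round(fraction_above_threshold(sizes, signal), 2))
```

A heavily degraded FFPE sample would concentrate its signal below the cutoff and score low, whereas RIN assumes intact ribosomal RNA peaks that fragmented samples no longer have.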
The second metric addresses the quality of protein content. “The sample-specific protein integrity number (SPIN) employs SELDI-TOF MS to identify labile and stable proteins to formulate an assessment ratio in specimens intended for protein biomarker analysis,” says Geddes.
“Every biospecimen type will have its own integrity SPIN fingerprint to determine its utility. The resulting spectrum data identifies both stable and dynamic protein peaks by their masses. The intensities of select characteristic protein peaks in turn are used to formulate an index, which reflects the quality of the protein content of the specimen. The intent is to further develop SPIN to encompass every specimen type, tissue, fluid, etc.”
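The ratio Geddes describes, comparing labile against stable protein peaks, can be sketched as follows. The peak masses, intensities, and tolerance are hypothetical; the real SPIN peak panels are specimen-specific and not given in the article.

```python
# Illustrative sketch of a SPIN-style protein integrity index. All peak masses
# and intensities are hypothetical; actual SPIN panels are specimen-specific.

def protein_integrity_index(peaks, stable_masses, labile_masses, tol=0.5):
    """Ratio of summed labile-peak to stable-peak intensities.

    peaks: dict mapping m/z (Da) -> intensity from a SELDI-TOF spectrum.
    stable_masses / labile_masses: characteristic peak masses (Da).
    tol: mass-matching tolerance in Da.
    """
    def summed(masses):
        return sum(inten for mz, inten in peaks.items()
                   if any(abs(mz - m) <= tol for m in masses))
    stable = summed(stable_masses)
    if stable == 0:
        return 0.0
    # Higher index -> labile proteins still intact relative to stable ones.
    return summed(labile_masses) / stable

spectrum = {6631.0: 1200.0, 8565.0: 800.0, 11730.0: 450.0, 13880.0: 300.0}
idx = protein_integrity_index(spectrum,
                              stable_masses=[6631.0, 8565.0],
                              labile_masses=[11730.0, 13880.0])
print(round(idx, 3))
```

Normalizing against stable peaks makes the index a property of specimen handling rather than of total protein load, which is what a quality metric needs.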
Operating a biobank inevitably involves some risks to patients’ privacy. Andrew Brooks, Ph.D., COO of the Rutgers University Cell and DNA Repository (RUCDR) and associate professor of genetics at Rutgers University, will discuss the potential for identity theft in a biobank.
“First, we need to protect the identity of a sample through good QA/QC programs engineered into workflows to help prevent the contamination of any given biosample,” he says.
An example of an approach to consider is the RUID™ performance and identity panel, he explains. This is a technology developed at the RUCDR that “fingerprints” every DNA sample to create a unique profile for every subject. It provides data on sample performance, ethnicity, gender, parentage, and sample uniqueness. This allows for both clinical data reconciliation in real time and for positive sample identification in any study design.
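One way such a fingerprint supports positive sample identification is genotype concordance checking between a banked profile and an incoming sample. The SNP identifiers, genotypes, and cutoff below are hypothetical illustrations; RUID’s actual marker panel and algorithms are not described in the article.

```python
# Hypothetical sketch of SNP-panel fingerprint reconciliation. The rsIDs,
# genotypes, and 0.99 cutoff are illustrative assumptions, not RUID internals.

def concordance(profile_a, profile_b):
    """Fraction of shared, called SNPs that agree between two profiles."""
    shared = [rsid for rsid in profile_a if rsid in profile_b
              and profile_a[rsid] != "NC" and profile_b[rsid] != "NC"]
    if not shared:
        return 0.0
    matches = sum(profile_a[r] == profile_b[r] for r in shared)
    return matches / len(shared)

banked = {"rs1000": "AG", "rs2000": "CC", "rs3000": "TT", "rs4000": "AT"}
incoming = {"rs1000": "AG", "rs2000": "CC", "rs3000": "TT", "rs4000": "AA"}

# Flag a possible swap or contamination when concordance drops below a cutoff.
MATCH_CUTOFF = 0.99  # assumed threshold for illustration
c = concordance(banked, incoming)
print(c, c >= MATCH_CUTOFF)
```

A mismatch like the one above would trigger reconciliation before the sample’s data ever enter a study dataset, which is the real-time QA/QC role Dr. Brooks describes.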
The RUCDR also protects a subject’s identity through validated and regulated processes for clinical data management and biosample ID assignment, continues Dr. Brooks.
“For instance, primary clinical samples that are received at the RUCDR are assigned a unique random ID that is segregated from any study information at the time of accessioning. Data from the repository, including analytical data, is never commingled with any subject information at any point in the sample life cycle,” points out Dr. Brooks.
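The accessioning pattern Dr. Brooks describes can be sketched as a two-store design: the repository holds only the random ID and sample data, while the ID-to-subject link lives in a separate, access-controlled table. All names, ID formats, and fields here are illustrative assumptions, not RUCDR’s actual system.

```python
# Minimal sketch of de-identified accessioning, assuming a two-store design.
# All names, ID formats, and fields are illustrative, not RUCDR internals.

import secrets

link_table = {}   # restricted: random ID -> subject/study information
repository = {}   # visible to analysts: random ID -> sample/assay data only

def accession(subject_id, study, sample_meta):
    """Assign a unique random ID, keeping the subject link in a separate store."""
    while True:
        rid = "S" + secrets.token_hex(6).upper()  # 12 random hex characters
        if rid not in link_table:                 # guarantee uniqueness
            break
    link_table[rid] = {"subject": subject_id, "study": study}
    repository[rid] = {"sample": sample_meta}     # no subject info stored here
    return rid

rid = accession("PT-0042", "STUDY-7", {"type": "whole blood", "vol_ml": 4.0})
print(rid in repository, "subject" in repository[rid])
```

Because the repository record never contains subject fields, analytical data and patient identity can only be joined through the restricted link table, mirroring the separation the quote describes.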
© 2016 Genetic Engineering & Biotechnology News, All Rights Reserved