Sponsored by Brooks Life Sciences


Moderated by Andrew Brooks, Ph.D.

Dr. Brooks: Let’s start the discussion with an eye on both research and clinical use of biological samples. Are there globally recognized standards for specific storage temperatures for specific biomaterials, and what is the importance of associating sample freezing techniques with those storage requirements? Is there a need to qualify consumables—sample storage tubes, media, and other materials—to ensure the quality of biomaterials over time and across temperatures?

Dr. Hubel, given your expertise in cryobiology and sample preservation, I was hoping you might be able to provide the first comments.

Dr. Hubel: In general, we need to store samples at a temperature at which we prevent degradation of the sample. There are two basic mechanisms for that degradation. One involves activation or deactivation of molecules inside the sample that act to degrade the cell or the tissue or the protein. The second is the behavior of water, because we need to be able to store samples at a temperature at which water is no longer mobile and, therefore, cannot participate in a degradation reaction.

We do not know all of the deactivation temperatures for every molecule that can potentially degrade a biological sample. That being said, we do know that certain molecules are active at fairly low temperatures, let us say less than –80°C. So there is the potential for storing samples at temperatures low enough to make sure that all of the molecules inside the sample are essentially deactivated.

In terms of freezing technique, it is well known that the freezing of mammalian cells is strongly influenced by the cooling rates, and there is a certain amount of evidence that indicates that other types of biospecimens (e.g., tissues, proteins, whatever) may be influenced by cooling rate, as well.

It would be helpful, even in a retrospective manner, if we understood the temperature history of a sample so that we could determine whether the cooling rate was consistent or varied widely. This would be especially valuable for samples in which we are not sure whether the cooling rate has influenced quality.


Dr. Brooks: Do those same principles apply to other types of biomaterials that are derivatives, such as a nucleic acid that comes out of the cell, or some enrichment of a specific biological analyte? Should we take these same principles into consideration with respect to water, buffers, and freezing media?

Dr. Hubel: Once again, the point is this: if we know that information, even if we do not control it, we can at least go back retrospectively and see whether certain processing conditions, whether they are cooling or thawing conditions, influenced the downstream quality of the sample.

I have not seen a publication that systematically looks at the influence of a cooling rate or a warming rate on the quality of the nucleic acid. But at least if you have knowledge of all of those parameters, you can go back and retrospectively look at your data if you have outcomes.


Dr. Brooks: Do you think that there is a need for the qualification of consumables, such as storage tubes, that could be used to ensure the quality of biomaterials and prevent sample degradation as a function of leachables or other contaminants, or do you find that a cryovial is a cryovial is a cryovial?

Dr. Hubel: The short answer is yes. The long answer is that the process is the product—meaning that what you do to your biospecimen during its processing influences the quality and the end uses of that material. That includes any reagents or disposables that are involved in the process. That has been well documented for particular cells being used therapeutically.

I would assume the same kind of issues are present when we are talking about nucleic acids or any other derivative that might be used for a downstream application.


Dr. Brooks: Sample identity can be a challenge with potential errors occurring at many points in the sample life cycle. These errors are often identified after analysis has been completed, and in some cases are only found by chance.

The next question is how can the field standardize sample quality assessment not only to identify sample errors, but also to predict and ensure that the analytical outputs are high quality? These things exist in analytical labs, but what tools can be created to bridge the gap between sample processing, storage, and analysis to create more robust datasets?

Dr. Levy: I think, similar to what Dr. Hubel just said, the “process is the product.” Being in the analytical space, oftentimes you are a bit in the middle and have, as you mentioned, limited control over what happened to the samples—particularly how they were collected or even processed.

A good example, part of a historical and well-worn system of sorts, is the collection of formalin-fixed, paraffin-embedded (FFPE) tissues. These are amazing biobank materials, but they are generally collected and processed for histopathology, and therefore will not and probably never will be handled in an optimal way for other derivative purposes.

Now, if we are looking at this from the perspective of how the most information and the maximum value can be derived from these samples, and what sort of standard can be introduced, I think there are significant opportunities.

One way to bridge that gap from a sample integrity perspective, similar to other processes, is to design and implement process controls that first help maintain the integrity of samples. That can include the types of tubes and the way those tubes or containers are used to be sure that sample cross-contamination is kept to a minimum during shipping, handling, and other events like that.

Secondary to that, or perhaps a better opportunity, is the development of standards that can be added into the samples, depending on what the analytical outputs are. For genotyping, sequencing, or other nucleic acid–based analyses, several groups have designed and implemented tools that allow nucleic-acid spikes, endogenous nucleic acids, or other endogenous materials to be added to samples, which can then be used to assay their integrity or the overall chain of custody and identification of those samples.

Then, perhaps tertiary to that, is the use of fingerprinting assays or other analytical assays performed in a manner that allows more complex or more extensive sample integrity to be assessed. So, as part of that, the process is the product.

If we combine those policies, procedures, and tools—like endogenous spikes or fingerprinting assays—or if we incorporate direct-to-consumer assays as they become more sophisticated and more uniformly available, we have an informational question: Can we validate a sample based on information that is in a medical record, information that is in a direct-to-consumer assay, etc.? That becomes, perhaps, a fourth type of integrity measurement that is outside of the given assay.

Overall, the biggest opportunity in terms of design and implementation is in those endogenous spikes, the idea that when a sample enters a laboratory, there is an addition of a standard, and that standard is then traceable with that sample all the way through the process, and can be read at the end.


Dr. Brooks: Following this train of thought, would more information about the history, storage, and preservation of that sample be useful to analytical labs? Appreciating that standardization would be great, although it is a tall order, given the diversity of labs that are working with samples.

But is there additional information that would make your job in assessing sample quality or the output of that data more straightforward or more robust?

Dr. Levy: Absolutely. In fact, using the transition from tissue to isolated nucleic acid as a specific example, knowing the history of that sample, how it was stored, processed, held, etc., is very valuable in terms of determining the optimal nucleic-acid isolation conditions to get the highest quality of nucleic acid out. The more information, the better.


Dr. Brooks: Following up on this discussion, the next question relates to direct-to-consumer sample management and collection. We all realize that in the future there will be more biological samples collected through direct-to-consumer avenues than from many of the traditional sources used today.

Many direct-to-consumer activities focus on different objectives other than traditional research and clinical biobanks. Yet the use of the samples and data can be very similar.

What principles around sample chain of custody and sample quality can be harmonized across these activities to ensure that the client, subject, or patient interests are fully leveraged? Is there an opportunity for subjects, patients, or clients to ensure that their samples are being responsibly managed and identified correctly given that today it may be a wellness report, while tomorrow it might be a biological product that would be used in their standard of care for any health-related issues?

Mr. Hamilton: LifeVault is obviously a direct-to-consumer biomaterial storage company. Our main concern is ensuring, in a blinded fashion, the integrity and the identity of the sample.

We provide customers with complete control of their biosample, so the integrity of how we process and store their samples—and the partner we choose to do that, because that is not our expertise—is essential for us. To go back to a commonly used phrase, the process is the product for us.

We need to know where that sample has been at every single point in time and what temperature to which it has been exposed. We need to be able to back-reference identity because the materials that we are storing are going to be personal in nature. So they are not to provide a generalized set of information to be made available to everybody. They are to be made available to the individual and for the individual’s use only.

The process to ensure integrity and identity, I think, is core to that, as are the things that groups like Dr. Brooks’s are integrating into the process to ensure that integrity and identity can be maintained. That would be the core principle that we apply.


Dr. Brooks: With the data collected in the direct-to-consumer avenue, I see the potential for information about the patient or the subject to become fragmented. Given that consumers are now taking part in their own healthcare and wellness, do you think there are opportunities to provide resources for them to link up data so that it can be used more productively?

Mr. Hamilton: Yes. At least what we are seeing in the consumer space around this is that people want to own their own information. How that information is controlled and accessed and regulated is extremely important because clearly, you do not want that information available to everybody.

It is also important to be able to leverage the capabilities that exist within information management systems and automated data retrieval systems to provide that direct access not only to the consumer but also to the physician because a lot of these activities will be regulated through their healthcare provider, even though they are going to have a somewhat direct-to-consumer experience in terms of ownership.


Dr. Brooks: Following the theme of the process is the product, and reflecting on the many advances in assessing the effect of preanalytical variables on sample quality, how do we take the next steps in regulating this activity? Is this something that should be considered for national or international regulation? If so, should it be limited to specific biomaterials and specific points in the sample life cycle? What are the needs for lab automation, consumables, and sample management tools and associated technologies to create a more robust process?

Dr. Betsou: Today there is more and more evidence concerning the impact of preanalytical variables on analytical results. You discussed global standards, and global standards make me think of ISO standards, which are international standards.

As you know, there is the ISO 15189, which is an accreditation standard for clinical laboratories and which is, to my knowledge, the only standard that integrates the preanalytical processes as part of the standardization process.

Now, the ISO 15189 is one thing. It concerns clinical laboratories, so its scope is limited. Talking about more globally applicable standards, there is also an ad hoc body, Technical Committee 140 of the European Committee for Standardization (CEN/TC 140), that is developing a series of technical specification documents specifically focusing on preanalytics.

There is one difficulty when you talk about global standards, that is, standards that would apply to the whole chain—from research and discovery to validation to the clinic and the clinical laboratory. Of course, the challenge is that we are talking about different needs.

For example, the different time frames: a clinical laboratory will store the samples and will analyze them after a couple of days or weeks, maximum, in contrast to a biobank, which will store the sample for years or decades before the sample gets analyzed. So, different time frames lead to different needs in terms of stability. Also, different purposes.

For the analytical laboratory, the purpose varies depending on the defined analyte that will be measured. For the biobank, the purpose seldom varies during collection and storage.

I say this because what we observe in these international standardization efforts is usually the lowest common denominator. For example, sometimes we see that the storage temperature is specified as –20°C or below, which may be okay for a few days in a clinical lab, but is not optimal for a biobank, which will store samples for decades, as Dr. Hubel explained.

Now, about regulating this activity. Regulating or not, I think, depends not so much on the type of biomaterial or the point in the sample life cycle, but more on the purpose. If we are in the preclinical research area, where the purpose is discovery, I do not think regulation would be a good solution. On the contrary, further downstream in the process, in the validation phase, regulation is appropriate. When we are in the context of regulatory submissions, regulation is necessary because there everything must be under the highest quality standards.


Dr. Brooks: That raises an excellent point. What we are seeing now in the field is the potential for repurposing discovery samples for clinical applications. So, how do you feel about standardization for preparation of samples?

Dr. Betsou: In this context, traceability is, as has already been mentioned, the most important thing. We should document all the critical preanalytical variables, preferably in a standardized way. You may be aware of standardization approaches such as the Standard PREanalytical Code (SPREC). There may also be other ways.

Essentially, we must document and be able to use these preanalytical variables, on the one hand, as co-variables used in the statistical analysis of the data when we are in the discovery process and on the other hand, as data necessary to assess the fitness for purpose of the samples when they are to be used as you just described.


Dr. Brooks: Do you feel there is a need for some standardization or harmonization around the use of specific lab automation or consumables or sample management tools? What is the role of technology in creating a more robust process? Is this something that is going to be too difficult to achieve? If there was specific technology for standardization of these applications, would that help the community achieve this goal?

Dr. Betsou: Yes, absolutely, because we are witnessing the problems that occur due to the lack of workflow integration—errors and mistakes that jeopardize the whole process. A few years ago, there was a series of case studies that were publicly available on the ISBER website.

There was a working group on integrated workflows, and its series of case studies provides concrete examples of how important this workflow integration is, including standardization in terms of lab automation, consumables, barcodes, labels, and the technologies that ensure the robustness of the whole process.


Dr. Brooks: There has been a lot of movement toward the consolidation of large sample collections to deal with many of the issues that we have been discussing—that is, the variability in laboratories that do not have the experience, the regulatory oversight, or the right process to create that product.

In your collective opinion as a panel, how centralized should large programs become? Should there be different standards for different types of biobanks—whether they are working with cells or with nucleic acids—based on the size and types of biomaterials they manage?

Do you feel that this consolidation activity would have a positive impact on sample identity and quality, whether you are coming at it from the analytics perspective, the direct-to-consumer perspective, or basic research?

The question is as follows: Is consolidation of biomaterials a good thing, and how big should that consolidation get before it becomes unmanageable or not as useful? Does that help address the variability that we talked about in the earlier questions?

Dr. Betsou: Centralization, I think, depends again on the type of biomaterial and the end use. In general, it is a good thing because it ensures more consistency in the quality of the output. But where there is a processed type of biomaterial and a downstream analysis that, for logistical reasons, is impossible, very difficult, or very costly to centralize—for example, if the materials have to be processed within one hour of collection—then, de facto, things cannot be centralized, and the processing will take place in a decentralized way.

However, in this case, there are things that can be put in place to ensure at least maximum consistency and standardization. One such measure is the ring trial. External quality assessment programs may allow the processing to take place in a decentralized way. In such a ring trial, or External Quality Assessment (EQA) program, the purpose is to ensure that multiple sites process the samples in a similar way.

Mr. Hamilton: I would just add, from the centralized perspective, it certainly makes sense from a consistency perspective and for the implementation of protocol and practice. Obviously, I do not think it can be wholly centralized to a few key players, given the types of biomaterials and the specialties required to manage and manipulate them.

From my background in stem cells, I know that how cell lines behave under standard protocols is inherently variable from lab to lab and institution to institution. So the ability to manage and get expected results from a common biomaterial across multiple labs and sites is very difficult.

In that respect, I think centralization does create a bigger benefit. But I do think ultimately it would depend on the type of biomaterial that you are storing.


Dr. Brooks: What is missing today in the current sample chain of custody and sample storage processes and technologies that service providers and technology providers should be focusing on to create a better integration of today’s biobanks? What will be used or required to incorporate sample collection and analysis in a true standard of healthcare?

As Dr. Betsou pointed out earlier, decisions are made on programs about preanalytical variables and regulations of processing based on what the samples are going to be used for downstream. If it is clinical, there is a specific process and a different accreditation. If it is discovery, there is a different process and accreditation.

If biobanks and cryopreservation and cellular preservation are going to become clinical realities as a part of our standard healthcare process around the world, what technologies are missing, and what integration is required to have this type of activity make that transition?

Dr. Hubel: I think that much of what is done in the preservation field really reflects technology and approaches from the ’70s, yet we are asking so much more of our biological samples. A lot of my work is in the field of cell therapy, and the thing that is limiting the growth of cell therapy is the development of a manufacturing paradigm.

Because cell therapy requires a viable and functional cell, the supply chain issues for cell therapies are really critical, and they involve preservation.

In this particular aspect of biospecimens, the development of a manufacturing paradigm, meaning that there is technology to support consistent, high-quality processing all along the manufacturing process, is what is limiting the field. And in particular, for preservation, that means understanding the complete temperature history and also the quality of what is going into the system and meaningful metrics of post-thaw assessment, as well.


Dr. Brooks: Those are excellent points. Mr. Hamilton, do you have a perspective, given that the models and the products that you are creating have that dual focus—you are providing information for people to make decisions, which is not specifically clinical in nature, but at the same time you are preserving a clinical product that would be used directly in someone’s healthcare? Are there any technologies or approaches that are currently missing, in your view, to help firm up that integration?

Mr. Hamilton: Obviously, we are working with what is currently the standard of practice, and we want to work to the highest state of existing technology. The development and validation of systems that integrate all that information and provide a comprehensive record for us to track is important.

Jumping back into the cell therapy space within cryopreservation, and to reaffirm what Drs. Betsou and Hubel said, instituting newer practices and processes in a standardized way to ensure consistent viability of cellular materials to enter a manufacturing process is going to be a key to transitioning these biomaterials into the cellular therapy space.

Ultimately, a lot of money and resources are being used to create these opportunities to manufacture either allogeneic- or autologous-based therapies. But some of the practices and protocols utilized to preserve the core materials that feed into that are, I would say, in some ways outdated or have not been reevaluated against a higher standard, given that the material is going to go into a cellular therapy process.


Dr. Brooks: I will add one comment, given that our service organization works with pharma and biotech on research and discovery, and that we work with diagnostic partners on actually running diagnostic tests used for patient care. I can tell you that the lines are not even blurring anymore—it is becoming a single pipeline for the collection and processing of samples. The environment is regulated so that there is no limitation on downstream use, allowing a transition to using those biomaterials for healthcare.

Certainly, cellular preservation and cryobiology are critical because living tissue is involved. But the processing of those samples into analytes must be done in a manner that does not limit the usage of those samples.

And this is not just about the sample process, even though we are standardizing chemistries and consumables. It is also about standardizing consents for the use of those samples—consents as a function of research (covering what the samples can be used for later in the development of commercial products) and as a function of commerce, with terms and conditions that are, in effect, a type of consent for the direct-to-consumer marketplace, where you are assigning or agreeing to the use of those materials. That data is necessary not only for the product that you have brought on, but also for the development of other things.

What we are seeing is that those can be very different. Certain companies will use the data and the samples for the development of other products in a non-identifiable way. But certain companies limit use to your own biomaterials because they could ultimately be part of your own independent healthcare. So, do you have a final comment, Mr. Hamilton?

Mr. Hamilton: We subscribe to an opt-in model. Our terms and conditions at the start basically say that you own the materials; we do not have any rights to ultimately commercialize off of your materials, but we do have the rights to analyze them to provide opportunity. I think as you look at biomaterials as a resource for healthcare, we have to have the ability to look into those materials to determine the best possible outcome.

But we do recognize that this is a personal product, a personal asset; therefore, our model is that customers have to opt in and understand those terms and conditions—that we have the right to analyze and present new opportunities to them. When that time comes and we present a new opportunity, they have to consent to that commercial application, and we aggregate all that information so that in the future, if we can make it available to a study, a particular foundation, or a group to help develop a new therapy, we have consolidated all those records.


Dr. Brooks: Is there an opportunity to create an overarching framework that better integrates these entities—research, clinical, population-disease-specific, and direct-to-consumer biobanks—and the processes that they run to preserve chain of custody and sample quality across all areas?

What is the string across these entities that we can tie to in our respective fields to provide the best-quality biomaterials and the best-quality data?

Dr. Betsou: Yes, there is an opportunity today for this. The opportunity is created by three things: standardization bodies, regulators, and scientific journal editors because we incorporate the research component in this whole continuum. Clearly, the journal editors have a role to play in implementing this overarching framework.

What is needed, I think, is a kind of horizontal standard. Everybody needs to recognize the importance of the process—what Dr. Hubel called the manufacturing paradigm—because when we are in the clinical area, all of the focus is placed on the analytical measurement. We may mention preanalytics, of course, but it is treated as an accessory; the focus is placed on the analytical.

We need to recognize the importance of all the upstream steps so that the whole workflow addresses, for example, the reproducibility of the process when we are in the research area. And when we are in the clinical area, we need to address, perhaps as a priority, the precision of the measurement.

When we are in population-disease biobanks, we also need to address the stability of the samples. When we are in the direct-to-consumer context, we need to recognize the importance of robustness in terms of shipment conditions and so on.


Dr. Brooks: In the field of cryobiology—whether samples are being used for basic research, technology development, or, in this instance, clinical applications—is there a string or theme across these that you find most important for the implementation of more standardized cryobiology technologies, understanding that much of the technology dates from the ’70s? Moving forward, is there any advice or opinion you could provide?

Dr. Hubel: When I heard your question, the first thing I thought about in terms of a string connecting everything is the importance of education and training, the support for issues like best practices, because those fundamentals span all of the different applications. Each particular type of biobank may have its own QA/QC or training standards.

But all of them need to understand, first of all, the scientific basis for the preservation process and the manner in which that scientific basis is manifested in protocols. And then, because we are human beings and we fail, we need some sort of oversight—whether it involves proficiency training, an audit function, or something of this sort—to keep track of whether best practices, protocols, or whatever measures are appropriate for a given biobank are actually being implemented.

Mr. Hamilton: I would jump on the education side of things, the communication of those standards. Oftentimes, at least in my past fields, which include stem cells, and in the biobanking industry, there is an inconsistent understanding of what the current practices are and a difficulty in getting those standards communicated across the industry.

For example, we collect samples with the idea that they are going to have the highest utility, ultimately diagnostically or therapeutically. That is our target. But as you mentioned, at the end of the day, there are a lot of biobanks that exist for a research environment.

If minor changes or minimal costs could be absorbed up front, a research sample could become usable across the entire workflow, or across the four different biobanking strategies. Communicating that could increase the utility of the biobanks that exist.


Dr. Brooks: That brings us to the end of the discussion. I think all the perspectives that you provided—with respect to sample preservation, understanding preanalytical variables, how this translates into products, both cellular and diagnostic, and the analytical perspective on the importance of sample quality—are really critical.

And even though we all operate in our own different spaces from a day-to-day perspective, having you all together and hearing your opinions on these global questions that have no right or wrong answers, I think, is going to be extremely useful to the community that we serve.

Thank you for the time that you have taken to participate in the conversation. Truly, exciting times are ahead for all of us. There are going to be new challenges and certainly new advances.
