November 1, 2014 (Vol. 34, No. 19)
Biological assays drive basic R&D, the identification of new biomarkers, and subsequent diagnostic testing. Two recent conferences highlighted new insights, provided detailed updates, and showcased novel technical advances in the field.
IBC’s 2014 Bioassay Conference emphasized not only the importance of verifying that potency assays are performing properly, but also the proper handling of potentially meaningful data outliers. CHI’s Biomarkers and Diagnostics World Congress showcased innovative assays that are challenging the sensitivity and specificity of traditional ELISA approaches.
New developments include acoustic assays that utilize vibrating biosensors to identify biomarkers as well as glass nanoreactor assays for evaluating multiple analytes in individual microfluidics cartridges. Despite these refinements, the field continues to face many challenges with a rather sluggish record for developing clinically translatable assays.
Although many studies have examined the use of sophisticated -omics approaches, little progress has been made in translating these discoveries into clinically useful applications for improving diagnosis, therapeutic choices, and monitoring, suggested Corinne Solier, Ph.D., head of extramural research, pharmaceutical sciences at F. Hoffmann-La Roche.
“In fact, the introduction rate of biomarkers into clinical use has been static at approximately one to two per year for the past 15 years,” complained Dr. Solier. “Furthermore, despite significant technological developments, unbiased proteomics have failed to translate protein biomarkers into clinical use.
“Technological limitations mainly relate to sensitivity, accuracy, and reproducibility. Lack of well-established methods for validation of candidate biomarkers in large clinical sample sets also has held back biomarker research. Poorly defined cohorts of controls together with huge biological variability among patients, and preanalytical sample variability present significant challenges.
“Further, the broad dynamic range of proteins in samples, especially in serum or plasma, raises another technical issue for protein biomarker discovery. Altogether, these factors represent significant roadblocks to development of novel, clinically relevant biomarkers for prognosis, diagnosis, and therapy monitoring.”
Several new approaches are helping solve some of these issues. “Technologies combining multiplexing, sensitivity, and specificity are the holy grail of biomarker research,” proclaimed Dr. Solier. “Although antibodies may sometimes lack specificity and contribute to false discovery, they can be used in novel ways to enhance the functional detection sensitivity of proteins of interest, rather than method specificity.”
An example is combining antibody-based tools with selected reaction monitoring mass spectrometry (SRM MS). Antibody-enriched SRM can be utilized for biomarker identification and applied to both tissue and biofluid discovery. SRM exploits the unique capabilities of triple quadrupole MS to derive a quantitative analysis of well-defined peptidic sequences that enhances selectivity.
“In a nutshell, while unbiased proteomics (MS) approaches achieve sensitivity in the range of microgram to milligram per mL, antibody-based technologies including reverse-phase protein arrays, antibody-enriched SRM, and various immunoassay platforms reach sensitivities below high picogram per mL concentrations and even as low as sub-picogram per mL ranges with high-sensitivity immunoassay platforms.”
Dr. Solier ended on an optimistic note: “Although biomarker discovery has been slow, new progress is being made especially by combining multiple omics technologies to increase confidence in the data. We may be able to dramatically improve biomarker identification using more stringent methods to curate the existing data by adding in the power of systems biology.”
Knowing if an assay is “working properly” or if biological components are going bad requires constant vigilance. “As with all analytical techniques, to judge the potency of bioassays one must use objective criteria,” remarked Michael Sadick, Ph.D., senior manager, large molecule analysis and characterization, Catalent Pharma Solutions.
The potency of most bioassays is gauged by comparing dose-response curves of the test sample to a reference standard. “However, not every relevant factor can be controlled in assays,” noted Dr. Sadick. “This makes it critical to assess data produced by each individual assay to see if the assay was actually performing correctly.
“In a recent whitepaper (BioProcess International, January 2014), we proposed a two-level, sequential examination of valid acceptance criteria. The first step is to determine whether the control material (we proposed defining this as the assay control sample) behaves similarly to the reference standard and yields a relative potency within a defined range. If this test fails, the data derived from the entire plate are judged invalid.”
Dr. Sadick said only when the first step passes does one proceed to the second step of testing each sample’s potency.
“Test samples are next subjected to what we call sample acceptance criteria (SAC), which are based on the responses of each individual test sample,” Dr. Sadick noted. “SAC are applied separately to each test sample, so whether one sample passes or fails is independent of the others. The shape of each sample curve should be similar to that of the reference curve.”
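To make the potency comparison concrete, here is a minimal Python sketch, not Catalent’s actual method, with all doses, responses, and parameter values invented: it models the reference and test samples with parallel four-parameter logistic (4PL) dose-response curves and reports relative potency as the ratio of their EC50s.

```python
# Hypothetical sketch (all numbers invented): estimate relative potency as
# the EC50 ratio of a test sample versus a reference standard, assuming
# parallel four-parameter logistic (4PL) dose-response curves.

def logistic4(dose, bottom, top, ec50, hill):
    """Standard 4PL dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** hill)

doses = [0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0]
# Simulated responses; the test curve is a horizontal shift of the
# reference curve (same bottom/top/slope), i.e. the curves are parallel.
reference = [logistic4(d, 0.1, 1.0, 0.5, 1.2) for d in doses]
test = [logistic4(d, 0.1, 1.0, 1.0, 1.2) for d in doses]

def fit_ec50(responses):
    """Grid-search the EC50 minimizing squared error; the other 4PL
    parameters are held at their known values for brevity."""
    candidates = [0.05 * k for k in range(1, 200)]  # 0.05 .. 9.95
    return min(candidates, key=lambda c: sum(
        (r - logistic4(d, 0.1, 1.0, c, 1.2)) ** 2
        for d, r in zip(doses, responses)))

# Relative potency of the test sample: ~0.5 by construction, since the
# test sample needs roughly twice the dose for the same response.
rp = fit_ec50(reference) / fit_ec50(test)
print(rp)
```

In practice a full 4PL fit would estimate all four parameters (e.g., with nonlinear least squares) and a parallelism test would precede the potency ratio, but the EC50-ratio logic is the same.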
Whether one is setting assay acceptance criteria (that is, assay control sample potency) or monitoring and trending other aspects of assay performance, one should apply appropriate statistical analysis, one of the most powerful ways to evaluate assay data.
“When statistical analysis is applied to the derived observed values, one can obtain a statistical process control (SPC) chart,” continued Dr. Sadick. This permits an objective evaluation of the variation observed in assay performance. It also allows limits to be set that will indicate when an action must be initiated to prevent the failure of an assay.
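The SPC idea can be sketched in a few lines; this is a generic Shewhart-style control chart with invented potency values, not Catalent’s actual charting procedure:

```python
# Hypothetical sketch of a Shewhart-style SPC chart for trending
# assay-control relative potency; the historical values are invented.
import statistics

history = [1.02, 0.98, 1.05, 0.97, 1.01, 0.99, 1.03, 0.96, 1.00, 1.04]
center = statistics.fmean(history)
sd = statistics.stdev(history)
ucl, lcl = center + 3 * sd, center - 3 * sd  # 3-sigma action limits

def in_control(potency):
    """True if a new assay-control result falls within the control limits."""
    return lcl <= potency <= ucl

print(in_control(1.01), in_control(1.35))
```

A result outside the 3-sigma limits would trigger an investigation before the drift leads to failed assays; tighter warning limits (e.g., 2-sigma) are often tracked as well.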
Dr. Sadick’s take-home message is that multiple assay components all contribute to the overall performance of the bioassay. From instruments to cells to reagents to buffers, he emphasized, every component must remain within defined limits to ensure assay validity.
Handling Data Outliers
In a perfect world, data from biological assays would follow a bell-shaped, or normal, distribution with small variation around the dose-response curve. Most biological assays, however, produce outliers: observations that deviate markedly from the rest of the data.
“Making a statistically driven decision to remove outliers without first making sure that the observations come from a known distribution (typically normal) is irresponsible,” said David Lansky, Ph.D., president, Precision Bioassay. “You can often substantially improve the performance of a bioassay by choosing an appropriate scale (transformation), applying a good outlier detection method, then doing the rest of the bioassay analysis.”
Although scientists typically think of outliers as having come from procedural errors, it is important to not simply remove them and move on. It is valuable, Dr. Lansky added, to periodically review the history of outliers in an assay system, looking for patterns associated with analyst, reagent source, or age, within-assay location, duration, sequence effects, etc.
According to Dr. Lansky, there are five parts of a good outlier management system:
- Fit all the data in an assay together, usually after a transformation that stabilizes the variance.
- Capture the design structure (samples, doses, and groups of observations such as cell culture plates) in the assay. It is important to model dose with a nonparametric regression that fits a smooth curve rather than one that imposes a specific shape.
- Use the statistical machinery of outlier detection to determine which observations are unusual.
- Investigate these observations and examine their impact on the estimates from the bioassay.
- Periodically review the assay system, looking for patterns in the outliers.
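The first and third steps above can be sketched in pure Python; the data, the choice of a log transformation, and the flagging threshold are illustrative assumptions, not Precision Bioassay’s actual method:

```python
# Hypothetical sketch: stabilize variance with a log transformation, then
# flag unusual observations with a robust z-score. Data and the 3.5
# threshold are invented for illustration.
import math
import statistics

responses = [105.0, 98.0, 110.0, 102.0, 96.0, 480.0, 101.0, 99.0]

# 1. Variance-stabilizing transformation (log often suits assay responses).
logged = [math.log(r) for r in responses]

# 2. Robust center and spread: median and MAD resist the outlier itself.
med = statistics.median(logged)
mad = statistics.median(abs(x - med) for x in logged)
scale = 1.4826 * mad  # scales MAD to sigma for normally distributed data

# 3. Flag observations whose robust z-score exceeds the threshold; these
#    are candidates for investigation, not automatic deletion.
flagged = [r for r, x in zip(responses, logged) if abs(x - med) / scale > 3.5]
print(flagged)
```

Note that the flagged points are only candidates: per Dr. Lansky’s advice, each should be investigated, its impact on the potency estimate examined, and its history tracked for patterns.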
The company provides several types of services related to statistical issues in biological assays: consulting on assay development, qualification, and validation, as well as web-based software for the analysis of biological assays. The software includes useful methods for the entire outlier management process.
The interface guides users along and even asks questions as to why they may want to remove data points. Also, it archives the data, the analyses, and the decisions so that users can assess their whole assay history. Dr. Lansky stated that combining good experimental design with state-of-the art statistical analysis could lead to high precision, ultimately saving time and money.
BioScale has developed Acoustic Assays, a technology for the development of biomarker assays. “The application of this technology was originally reported by developers at Millennium Pharmaceuticals who were trying to develop pharmacodynamic assays of tumor extracts, but who found that heme in their samples interfered with their optical assays,” explained Martin Latterich, Ph.D., BioScale’s CSO.
The ViBE platform uses BioScale’s new Acoustic Membrane MicroParticle (AMMP) technology, which integrates microelectromechanical systems (MEMS) sensors, biological capture strategies, and magnetic microparticles to create a sensitive, nonoptical interface for hands-free protein detection. Current assays on the platform support biomarker and pathway analysis targeting oncology, inflammation, cardiovascular, CNS, and some metabolic diseases. “Pharmacokinetic and other ligand binding assays are also being developed,” added Dr. Latterich.
One configuration of the assay is similar to a homogeneous sandwich ELISA. Sample and calibrants are incubated in a microtiter plate with capture antibody bound to magnetic microparticles and with labeled secondary antibodies. The mixture is then delivered to the antitag biosensor, which is vibrating at its natural frequency.
A magnetic field attracts the microparticles to the sensor surface. When the magnet is released, only the biologically bound microparticles (bearing the analyte of interest) remain adhered to the sensor. As buffer continues to flow, microparticles are washed away in accordance with the strength of sensor binding, which correlates with the analyte load; the continued flow also regenerates the sensor surface.
“There are many advantages to our acoustic assays,” asserted Dr. Latterich. “They allow up to 10 times the sensitivity of the best optical ELISA, down to picograms per mL. Because this is a nonoptical detection, it can be used with serum, plasma, tissue lysates, urine, and tissue/tumor extracts. It’s a set-up and walk-away system that provides excellent day-to-day reproducibility. It’s also very economical since it uses less than 10% of the antibody required in a typical ELISA.”
The company currently provides about 40 specific tests, and the ViBE Universal Assay Development Kit allows customers to prepare custom assays in their own labs in only a few days. “We will continue to broaden our menu of tests,” noted Dr. Latterich. “This is a technology that allows users to rapidly and cost-effectively develop novel, reproducible biomarker assays with exceptional precision.”
Glass Nanoreactors
To preserve reproducibility and specificity, bioassays such as ELISA typically analyze analytes individually. However, evaluating multiple biomarkers in a single sample could greatly enhance the level of information obtained, enabling the identification of protein networks as well as improved prognostic and diagnostic procedures.
“The conventional multiplexing approach is often relegated to preliminary research applications because of cross-reactivity concerns, compromised performance, and labor-intensive complex protocols,” said Rajiv Pande, Ph.D., vp of scientific affairs at CyVek. “The primary criticism is that multiplexed immunoassays lack reproducibility and don’t correlate well with conventional single analyte tests.”
To address this problem, the company developed an immunoassay technology, the CyPlex™ system. It integrates a disposable microfluidic cartridge and an automated compact benchtop analyzer. “This is a multianalyte but not a multiplex platform,” clarified Dr. Pande. “Traditional assays such as ELISA are highly specific because each antigen is paired only with its specific antibody. Our system retains this specificity trait because in the CyPlex cartridge, each sample protein or antigen is reacted only with its corresponding specific detection antibodies.
“A cartridge can analyze 16 samples, requires only 20 microliters per sample, and provides 4 results per sample. The entire cartridge is run in 60 minutes. Currently, we have more than 50 off-the-shelf assays, and we also provide custom assays.”
The cartridges employ a unique architecture where each sample is split into many microfluidic channels. Each channel has a proprietary solid phase consisting of silica glass nanoreactors (GNR), pre-immobilized with appropriate capture biomolecules. However, each channel contains only one kind of GNR. Thousands of GNRs are batch-immobilized to provide large single lots of different antibodies.
CyPlex’s cartridges constitute a network of isolated channels that are manipulated via microfluidics-based mechanisms. “This is important because the bound sample analyte in a GNR is never exposed to reagent cocktails that contain multiple nonspecific antibodies,” emphasized Dr. Pande. “Additionally, development of multianalyte assays becomes easier and faster as development efforts are limited to finding a good antibody pair for single analytes.”