June 15, 2009 (Vol. 29, No. 12)
Getting Regulations up to Speed and Clinicians on Board Will Be Key
There are many applications for multiplex assays, but we are only now realizing the full potential that this technology has to move compounds through the clinic more quickly and inexpensively.
Multiplexing is a life science lab research approach that performs multiple sets of reactions in parallel—simultaneously—and greatly increases speed and throughput. Multiplex assays can be further stratified, based on how many assays can be performed at a time.
“You have to be able to distinguish and define high throughput and multiplexing. How many samples can you do in parallel, versus how many analytes per sample? Understandably, these terms have fused and are used interchangeably,” noted Don Baldwin, Ph.D., director, Penn Microarray, molecular diagnostics and genotyping facilities, University of Pennsylvania, who spoke recently at CHI’s “microRNA in Human Disease and Development” meeting.
“For genotyping, are you talking about sample throughput or the ability to do lots of different SNP assays? Multiplexing doesn’t address how many people you can survey, but it does account for the ability to measure many things. High-throughput capillary sequencing is usually not multiplexed. Low-throughput next-gen sequencing is looking at seven samples or channels, but highly multiplexed sets of templates.
“What is the size of your cohort, and how many different things are you measuring? Are you testing in serial or parallel? Each specialized situation requires appropriate choices for multiplexing samples, analytes, or both.”
The technology is at an interesting intersection, Dr. Baldwin said. “We seem to be at a point where the technology is ahead of our knowledge of what to do with it.”
Setting the Standard
When technology rides ahead of our knowledge of how to handle it, standards become an issue. For this reason, governing bodies such as the FDA and the EMEA would do well to collaborate with both academic institutions and businesses to set the standards. The Critical Path Initiative (CPI) is the FDA’s national strategy for modernizing the sciences through which FDA-regulated products are developed, evaluated, manufactured, and used.
The CPI was launched in March 2004 to address the steep decline in the number of innovative medical products submitted for approval, despite the enormous breakthroughs being made in biomedical science. Although initially conceived as a drive to apply discoveries in emerging areas of science and technology to medical product development, the CPI has since expanded its scope to include all FDA-regulated products.
“Up until 2004, the FDA had some rather nonspecific guidelines, so the Critical Path was set up to address the safety and testing issues of potential and existing therapeutics and devices, as well as the tools that create them,” Craig Draper, Ph.D., senior product manager for multiplex assays at EMD Chemicals, a U.S. affiliate of Merck KGaA, explained at CHI’s recent “World Pharmaceutical Congress”.
One area of particular interest to EMD is drug-induced nephrotoxicity. “We have introduced new assays focusing on kidney toxicity and how these studies are being done,” said Scott Hayes, Ph.D., R&D director, assays. “Specifically, we have developed two 5-plex panels that detect markers for nephrotoxicity in preclinical studies using the rat model system.”
Dr. Hayes explained that drug-induced nephrotoxicity has been an ongoing concern, since studies often do not go on long enough to examine the effects of chronic, low-level kidney damage.
Several proteins were found to be specific biomarkers for kidney damage. Results were generated by Rules-Based Medicine using samples from Novartis and delivered to the CPI by the Predictive Safety Testing Consortium (PSTC). “These organizations and other members of the PSTC provided data that was used by the FDA and EMEA to develop guidelines on the use of the markers,” commented Dr. Draper.
Dr. Hayes also noted that “in the way many preclinical studies are set up with the current methodologies, it is easy to miss early-stage and low-level toxicity effects. There needs to be better sampling ability—and urine is an easy place to gather this information. The members of the PSTC screened many thousands of samples, enabling the FDA and EMEA to list the identified proteins as useful markers of drug-induced nephrotoxicity, and data on these markers will assist with the regulatory decision-making process.”
“EMD partnered with Rules-Based Medicine to develop multiplex kits that test several of the listed biomarkers, allowing companies to generate the data they need,” Dr. Draper said. “The end-user also has the option to outsource the testing, giving the customer a choice and flexibility on how they can generate the data they need.”
Dr. Hayes added that seven biomarkers have been confirmed. “Data from the two EMD panels using rat urine samples done with a collaborator in Germany showed significant fluctuations based on time points. For some test compounds, the longer the animal is exposed, the greater the toxicity over time.
“Compared to existing tests, this gives us a better idea of where the damage is and how it progresses. There is still data coming in about the damage that chronic use of drugs can do to the kidneys. Examining long-term levels of exposure is a crucial aspect of toxicity profiling. Existing tests are not sensitive to long-term, low-level chronic use, and that’s what we’re looking to change.”
As the technology gains traction, applications and opportunities abound. Personalized medicine is one area where the ability to run tests in parallel would be a boon. “The best way to service the elderly in the healthcare industry is to make it possible to test for everything in one place,” reported Steven A. Benner, Ph.D., Foundation for Applied Molecular Evolution (FfAME), at IBC’s “Tides Conference”. “As it is, patients travel from place to place to test for this and that. We need a way to make a genomic assay work—to understand from just one test what you need to know about the genetic part of a patient.”
Dr. Benner develops sets of reagents that allow detection of many DNA markers in low amounts in complex biological media. Mixing and matching these creates specific architectures that meet the specifications of the users.
“For example, look at infectious diseases; there are 100–1,000 markers that you would like to look for in a patient with a sore throat. Colon cancer is similar—for that you are still prescreening using a fecal occult blood test, which is not the best way to screen. If the colon cancer is bleeding, the disease has already progressed considerably,” noted Dr. Benner.
“In healthcare, we know the genetic markers for colon cancer, and we should be using those markers to give our patients much better care. You could test for diagnostic purposes and find the signal and part of the progression—if you can do this cheaply and multiplexed.”
Dr. Benner also added that, while instrumentation and cell-based technologies for human diagnostics targeting nucleic acids have advanced dramatically, novel reagents that deliver new capabilities to DNA- and RNA-targeted diagnostic architectures are lagging behind. FfAME is working on small molecule and protein reagents that permit highly multiplexed PCR, orthogonal capture, and highly selective priming, as well as combinations of these in architectures that support oligonucleotide analysis, from detection of a few targets in circulating blood to personalized genome resequencing.
“When it comes to diagnostics, we have to do better for our patients,” said Dr. Benner.
“There is a definite move to multiplexing on many more platforms,” explained Dr. Baldwin. “We offer services for microarray-based profiling of RNA, microRNA, DNA sequences and modifications, and proteins. Molecular diagnostics is still predominantly at a point where you only measure one thing at a time. Mostly, basic research has driven our demand for highly multiplexed assays. But some of that research is finally being translated into clinical applications that use the power of multiplexed analysis.”
Dr. Baldwin noted that there are now ways to barcode the samples. “That way you can multiplex many samples, as well as analytes, thus increasing the throughput. For molecular diagnostics—FDA has approved the first wave of multiplexed tests—there is no technical reason we can’t use multiplex technology, but we are still in the process of discovering what relevance tests have and how they inform clinical decision.”
In one application recently approved by the FDA, the Pathwork Diagnostics microarray tests the expression of several thousand genes to determine the tissue of origin for a metastatic tumor. “This approach is familiar to genomics researchers—take a tissue sample, extract RNA, put it on a microarray, match the gene-expression profile to reference sets, and find the pattern,” noted Dr. Baldwin.
“But pathologists can usually accomplish the same thing with histology and immunology tools. There is a big need for this technology, and we need to be doing more of this in the clinical setting, but it must offer clear advantages or enhance and expand current clinical practice.”
Many basic research projects are at the stage of consolidating gene-expression signatures from whole-genome data sets into focused multiplex panels. More formalized clinical trials will be the next step.
“A lot of people appreciate the power of multiplexing technology. Genotyping for structural genomics, and measurements of gene expression and activity for functional genomics, are all applications that have benefited. Now it’s time for us to turn to translational services,” remarked Dr. Baldwin. “And to do this well, you have to get the clinicians onboard early. Every test has to be justified, validated, and implemented with patient care in mind.”
At the CHI meeting, Dr. Baldwin observed that the rapid development of several genomics platforms for microRNA (miRNA) discovery and detection has created opportunities for translation to biomarker applications.
“Detection assays range from high-throughput single-target methods, to custom multiplex panels, to comprehensive microarrays, genomic tiling arrays, and next-generation sequencing,” he explained.
“Clearly, lots of people are working on microRNAs from basic mechanisms through biomarker utility to impact on disease development and therapies. This is what it’s all about.”
Immune System Sequencing
miRNAs are critical regulators of most major cellular processes, such as differentiation, apoptosis, and metabolism. Although most studies rely on the detection of previously reported miRNAs, the most powerful approach to identify and quantify the expression of new miRNAs remains direct sequencing.
To this end, Francois Vigneault, Ph.D., Church laboratory, department of genetics, Harvard Medical School, has developed a procedure that facilitates miRNA capture via ligation by barcoding samples, thus allowing multiplexing and considerably reducing the per-sample cost on next-generation sequencing platforms. Most projects Dr. Vigneault is working on are related to the multiplexing capability of next-generation sequencing instruments. He is focused on miRNA analysis, as well as a technology his group calls the Personal VDJ-ome for sequencing the complete immune repertoire of single individuals.
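The barcoding idea can be illustrated with a minimal sketch: a short barcode ligated ahead of each miRNA insert lets pooled reads from one sequencing run be sorted back to their sample of origin. The barcode sequences, read data, and 4-nt barcode length below are hypothetical examples, not details from Dr. Vigneault's protocol.

```python
# Minimal demultiplexing sketch: assign pooled sequencing reads back to
# samples by a 4-nt barcode at the 5' end of each miRNA insert.
# Barcodes and reads here are hypothetical illustrations.

BARCODES = {
    "ACGT": "sample_1",
    "TGCA": "sample_2",
    "GATC": "sample_3",
}
BARCODE_LEN = 4

def demultiplex(reads):
    """Group reads by barcode, stripping the barcode before storing the insert."""
    by_sample = {name: [] for name in BARCODES.values()}
    unassigned = []
    for read in reads:
        barcode, insert = read[:BARCODE_LEN], read[BARCODE_LEN:]
        sample = BARCODES.get(barcode)
        if sample is None:
            unassigned.append(read)  # barcode unreadable or not in the design
        else:
            by_sample[sample].append(insert)
    return by_sample, unassigned

reads = [
    "ACGTTGAGGTAGTAGGTTGTATAGTT",  # insert preceded by sample_1 barcode
    "TGCATAGCTTATCAGACTGATGTTGA",  # insert preceded by sample_2 barcode
    "NNNNTGAGGTAGTAGGTTGTATAGTT",  # unreadable barcode
]
by_sample, unassigned = demultiplex(reads)
print(len(by_sample["sample_1"]), len(unassigned))  # 1 1
```

Real pipelines additionally tolerate one or two mismatches in the barcode and trim adapter sequence from the 3' end, but the sorting step is this simple in principle.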
“Last year we optimized microRNA analysis on next-generation sequencers. The great thing is that there are fewer than 1,000 of these in human cells; it’s not like messenger RNA, which has a much greater dynamic range and longer sequences,” said Dr. Vigneault. “Sequencing microRNAs can be affordable; a typical cost per sample on a next-generation instrument is around $1,200. By combining multiple samples per lane on a single flow cell, we can currently bring that cost to around $25 per sample for microRNA experiments.”
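The saving Dr. Vigneault describes is simple arithmetic: the run cost stays fixed while barcoding spreads it over many samples. The pool size below is inferred from the article's round numbers, not stated explicitly.

```python
# Worked arithmetic for the per-sample costs quoted above.
# $1,200 is the quoted cost of running one sample alone; pooling barcoded
# samples in the same lane divides that fixed cost among them.
lane_cost = 1200.0        # ~$1,200 to sequence one lane (one sample, unpooled)
target_per_sample = 25.0  # quoted cost per sample after pooling
samples_per_lane = lane_cost / target_per_sample
print(samples_per_lane)   # 48.0 -> roughly 48 barcoded samples per lane
```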
Dr. Vigneault also noted that microarray technology can’t keep up with next-generation sequencing. “The average price for analyzing one microRNA sample on a microarray is $400 and is limited to what is on the chip. The problem we now face is that the new next-generation sequencers we are developing are just too good at multiplexing and would require up to 5,000+ samples per sequencing run to be worth it.”
Immunosequencing is another methodology that benefits from high-throughput technology development. “We are developing technologies that will allow the analysis of the immune system of a single individual in unprecedented detail and low cost,” he added.
“Antigens, bacteria, viruses, or cancer cells are recognized by receptors on B and T cells. All the millions of unique receptors required to accomplish this task are produced by genetic recombination of the V, D, and J segments, hence the name VDJ-ome. We are currently sequencing all of these receptors in a single experiment from one individual to study the evolution of his immune repertoire over time.
“With one blood draw, we will be able to tell you everything that you have been exposed to in your life and, ultimately, produce new drugs and antibodies against virtually any antigens of current diseases and the ones to come. This is only possible by developing new multiplexing technologies of extreme throughput.”
As indicated earlier, miRNAs and multiplexing figure prominently in cancer research. “I’m currently working on colon and pancreatic cancer, trying to link miRNA induction or reduction as a way of determining whether or not cancerous cells would exhibit resistance to chemopreventive agents,” commented Jennie L. Williams, Ph.D., cancer prevention laboratory, State University of New York at Stony Brook, at CHI’s miRNA meeting.
Dr. Williams uses Marligen’s Vantage™ microRNA Detection Kit, which profiles the expression levels of multiple miRNAs from many different sample types including total RNA, enriched low molecular weight RNA, and degraded RNA. The assays are configured on the Luminex xMAP® bead array allowing for the detection of multiple miRNAs in one sample. In addition, the 96-well format allows many samples to be analyzed in one run.
Dr. Williams is using this technology on a project in which her group has demonstrated that treatment with derivatives of nonsteroidal anti-inflammatory drugs (NSAIDs), which are promising colon cancer chemopreventive agents, inhibits colon cancer cell growth by suppressing cell proliferation and enhancing cell killing.
“However, very little is known about the molecular targets within the cancer cell that are responsible for this effect,” explained Dr. Williams. “microRNAs are known to control gene expression through translational repression or degradation of their target mRNAs via specific sites in the 3´-UTR.” She added that the deletion or mutation of one particular miRNA could potentially lead to the progression of disease.
Consequently, her group used Marligen’s multiplex bead arrays to determine and analyze miRNA distribution patterns in total RNA isolated from colorectal cancer cells treated with an NSAID derivative.
“The clinical response to chemopreventive agents often differs between individuals,” added Dr. Williams. “Since chemopreventive agents exert their effect through a molecular target, and because these targets are dysregulated in most cancers, it is conceivable that chemo-resistance occurs due to irregularity at the genetic level.”
The generation and evaluation of chemopreventive agents that address the issue of chemo-resistance is therefore essential. Correlating altered miRNA levels with the expression and/or activation of their target genes may ascertain their association with poor clinical outcome.
“A number of innovations for treating cancers are becoming available,” Dr. Williams said. “Using multiplexing systems specifically evaluating transcriptional factors that are up- or downregulated in response to treatment with chemopreventive agents pulls it all together.”
“It’s powerful technology, but this is just another tool in the toolbox,” concluded Dr. Baldwin. “Plenty of doctors want to try something new. But conservatism is there, appropriately, and you need validation. There are also cost implications. A lot of technologies are expensive, and you need to show value and return on your investment of healthcare dollars. More knowledge is good, multiplexed knowledge is better.”