Jeffrey S. Buguliskis, Ph.D., Technical Editor, Genetic Engineering & Biotechnology News

This Proven Technology Continues To Find Its Niche in the Modern Molecular Toolkit

The advent of microarray technology has a much longer history than many realize. Ask scientists, “What is a microarray?” and the overwhelming majority will describe some form of glass slide or plastic film onto which a multitude of genes are attached and then hybridized with fluorescently labeled nucleic acid probes. Once read, the results provide insight into gene expression patterns. Yet this modern version of the microarray didn’t come about until the late 1990s and early 2000s.

Humble Beginnings

Traveling back 41 or so years to 1975 (6 years before GEN’s inception), Edward Southern, Ph.D., then working at Edinburgh University, described in a Journal of Molecular Biology paper (“Detection of Specific Sequences among DNA Fragments Separated by Gel Electrophoresis”) his now eponymous technique, Southern blotting. His method identified specific genes within the genome by transferring electrophoretically separated cellular DNA to a nitrocellulose membrane and then hybridizing the blot with radiolabeled probes corresponding to the genes of interest.

Southern blotting was simple and relatively low tech, meaning it could be quickly adopted by many research laboratories, but scientists soon began to clamor for higher-throughput analyses. Around the same time, two Stanford University investigators, Michael Grunstein, Ph.D., and David Hogness, Ph.D., published an Escherichia coli colony hybridization method in which randomly cloned plasmids were plated onto agar and covered with nitrocellulose.

“The colonies to be screened are formed on nitrocellulose filters, and, after a reference set of these colonies has been prepared by replica plating, are lysed, and their DNA is denatured and fixed to the filter in situ,” the authors wrote in their 1975 PNAS article (“Colony Hybridization: A Method for the Isolation of Cloned DNAs That Contain a Specific Gene”). “The resulting DNA-prints of the colonies are then hybridized to a radioactive RNA that defines the sequence or gene of interest, and the result of this hybridization is assayed by autoradiography.”   

These techniques contain the fundamental framework for what modern microarrays would become. At the time, researchers were studying various genes and the genetics underlying their inheritance, but they were also searching for improved methods to observe gene expression. Understanding the molecular mechanisms that activate various genes was thought to be, and still is, integral to elucidating the pathogenesis of disease.

In 1977, another group of Stanford researchers adapted Dr. Southern’s technique to measure gene expression levels, modifying the method slightly to capture messenger RNA (mRNA) transcripts, as described in their PNAS paper “Method for Detection of Specific RNAs in Agarose Gels by Transfer to Diazobenzyloxymethyl-Paper and Hybridization with DNA Probes.” In homage to Dr. Southern, the scientists dubbed their new assay Northern blotting.

Moreover, investigators were beginning to engineer equipment that could pick colonies, like those described by Grunstein and Hogness, into 144-well microtiter plates. The 144-pin setup allowed researchers to replica plate arrays of 1,728 individual colonies onto a 26 × 38 cm plate. These samples were then blotted onto filter paper, where the DNA could be denatured and fixed, allowing the filters to be reused several times. Thus, high-throughput sample handling and gene expression profiling began to converge.

It would take another 10 years or longer before research groups began to use robotic systems to rapidly arrange DNA clones from microtiter plates onto filter paper. Meanwhile, the push to sequence the entire human genome was escalating, spurring significant developments in DNA cloning technology that would help usher microarray technology into the new millennium.

Modernizing a Molecular Tool

Early array technology had done an excellent job of working out the methodology for isolating, hybridizing, and detecting genes of interest while making substantial progress on high-throughput workflows. However, one main aspect of arrays still needed to be addressed: the “micro” portion. Shrinking arrays to microscopic scales would allow scientists to pack thousands of gene sequences into a more manageable footprint, cutting sample input requirements, reagent usage, and overall costs.

In 1995, Stanford researchers published a seminal paper in the journal Science (“Quantitative Monitoring of Gene Expression Patterns with a Complementary DNA Microarray”) introducing the first miniaturized microarray, in which samples were printed from microtiter plates onto a standard glass microscope slide within an area of just 3.5 × 5.5 mm. A year later, molecular biologist and Howard Hughes Medical Institute investigator Joseph DeRisi, Ph.D., and a group of investigators at Stanford described a method that allowed higher-density arrays to be printed onto the same slide area.

These newer arrays were further improved by replacing radiolabeled probes with fluorescently tagged oligonucleotides. The importance of fluorescence detection cannot be overstated: the technique is not only quite sensitive with a broad dynamic range, it also allows two or more samples to be labeled with different colors and hybridized to the same array. This two-color approach enabled researchers to measure the ratio of the two signals at each spot, creating a far more reproducible technique.
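As a concrete illustration, here is a minimal sketch of how two-channel data reduce to expression ratios. The spot intensities and the global median normalization step are illustrative assumptions, not details drawn from the papers above.

```python
import numpy as np

# Hypothetical two-channel intensities for five spots on one array:
# Cy5 (red) = experimental sample, Cy3 (green) = reference sample.
cy5 = np.array([1200.0, 340.0, 15000.0, 800.0, 95.0])
cy3 = np.array([1100.0, 700.0, 5000.0, 820.0, 400.0])

# Global median normalization: scale Cy5 so the two channels share a
# median, correcting for differences in dye and labeling efficiency.
cy5_norm = cy5 * (np.median(cy3) / np.median(cy5))

# Expression is reported as the log2 ratio of the two channels at the
# same spot: +1 means twofold up, -1 means twofold down.
log_ratios = np.log2(cy5_norm / cy3)
print(np.round(log_ratios, 2))
```

Because both samples hybridize to the very same spot, the ratio cancels spot-to-spot variation in how much DNA was deposited, which is why two-color designs proved more reproducible than comparing single-channel intensities across arrays.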

It would be next to impossible to address the topic of microarrays without speaking about Affymetrix, a name that has become synonymous with modern microarray technology. By the mid-1990s, Affymetrix had created technology that allowed DNA sequences to be synthesized directly onto the array surface using semiconductor manufacturing techniques. This in situ approach offered an advantage over spotted arrays, requiring only a small set of reagents for construction, essentially the four nucleotide building blocks, rather than a library of presynthesized sequences. In 1994, the company released its first major product, the human immunodeficiency virus (HIV) genotyping GeneChip®. Although other unique approaches have been created over the years to improve array technology, such as combining inkjet printing with oligonucleotide synthesis chemistry and self-assembled arrays, the GeneChip remains the industry standard.
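To see why a handful of reagents suffices, consider a toy simulation of light-directed in situ synthesis: in each coupling cycle, one nucleotide is added, and a mask exposes only the features whose growing probe needs that base next. The probe sequences and the simple round-robin A/C/G/T cycling below are hypothetical simplifications; real mask schedules are heavily optimized.

```python
# Toy model of light-directed in situ synthesis (assumption: a simple
# round-robin A/C/G/T cycle; real mask scheduling is optimized).
probes = ["ACGT", "AAGC", "CGTA"]     # hypothetical target sequences
built = ["" for _ in probes]          # probe grown so far at each feature

cycle = 0
while any(b != p for b, p in zip(built, probes)):
    base = "ACGT"[cycle % 4]          # nucleotide coupled this cycle
    # The "mask" deprotects only features whose next required base matches.
    for i, p in enumerate(probes):
        if len(built[i]) < len(p) and p[len(built[i])] == base:
            built[i] += base
    cycle += 1

print(built, f"finished in {cycle} cycles")
```

Because every feature grows in parallel, probes of length L finish in at most 4 × L cycles no matter how many distinct sequences the chip carries, and the only consumables are the four nucleotide reagents.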

Keep On Keepin’ On

Microarrays have expanded far beyond their early uses as simple gene expression profiling tools and now have applications that span the genomic gamut. While broad-based arrays, such as those designed to survey genome-wide single-nucleotide polymorphisms (SNPs), remain attractive, biotech companies are creating focused arrays that contain subsets of genes known to be involved in various human diseases. These targeted approaches allow clinical researchers to use microarrays as molecular diagnostic tests for a number of different disorders. For instance, in 2014 the FDA approved Affymetrix’s CytoScan® Dx Assay, a whole-genome, postnatal blood test designed to aid physicians in identifying the underlying genetic cause of developmental delay, intellectual disability, congenital anomalies, or dysmorphic features in children.

The greatest challenge the microarray industry faces comes from the burgeoning next-generation sequencing (NGS) market. Although NGS technology has some distinct advantages over microarrays, many researchers still prefer arrays’ proven, validated technology to generating NGS data. And because most scientists must mind their laboratory budgets, microarrays also come out on top on cost, requiring considerably smaller expenditures than NGS analysis methods.

A quick perusal of the scientific and popular science literature will, with fairly regular periodicity, yield articles predicting the ultimate demise of microarrays, usually at the hands of some newer, sexier technology. Yet here we are in 2016, assembling a 40-year or so retrospective. Seemingly, the sky is still the limit for a technology that had very simple beginnings and blossomed into a fundamental laboratory tool.

