Oliver Leven, Ph.D., Head of Professional Services, Genedata
Serendipitous pharmaceutical discovery is making a comeback through high-content screening.
In the early days of pharmaceutical research, discoveries were often made by chance. Take, for example, Alexander Fleming’s discovery of penicillin in 1928. Fleming observed that a certain mold stopped the growth of bacteria, and chemists later identified penicillin as the active chemical substance responsible for fighting the bacteria. Penicillin was soon mass-manufactured and became the hallmark drug of the emerging pharmaceutical industry. Fleming’s research approach followed a screening paradigm: establish a defined condition in a biological model and systematically search for substances that change this condition.
By the late 20th century, with the rise of combinatorial chemistry, molecular biology, and the genomic and pharmaceutical sciences, scientists expected to move beyond serendipitous discoveries such as Fleming’s. There was a clear expectation that the “secrets of life” would soon be understood and could be translated into targeted research toward new drugs. However, the promise of biochemical pharmaceutical science has not been completely fulfilled, and the current trend toward phenotypic screening is, in some ways, a return to past concepts of serendipitous pharmaceutical discovery.
From a purely mechanistic perspective, molecular biology explores how proteins interact with each other and how chemical compounds affect their function through inhibition or stimulation. Together with the genomic sciences, molecular biology enables the modification and production of proteins in large amounts. These two key technologies allowed scientists to screen more chemical substances at much higher throughput than the animal-based studies that predominated in the 1960s and 1970s: in vitro tests to find compounds inhibiting an enzymatic target became an affordable methodology and laid the foundation for high-throughput screening.
Three important elements were required to realize the value of high-throughput screening: (1) the microtiter plate in SBS standard format, which allowed integration, automation, and parallelization of the screening process through compatible pipetting devices, plate-handling robots, and detection instruments; (2) improvements in mechanics and electronics, which allowed more compact and elaborate devices; and (3) the provision and maintenance of large, ever-growing compound libraries and the associated storage infrastructure, which supplied the elaborate logistics the process requires.
Improvements in cell biology allowed the preparation of cell cultures with defined properties, supplementing enzymatic assays with cell-based assays for targets that could not be examined without living cells in the wells. The Human Genome Project drastically increased the number of targets of interest to pharmaceutical companies, leading to an increase in screening activities.
Phenotypic Screening Gives the Big Picture
High-content screening (automated microscopy combined with automated image analysis) was initially developed in the late 1990s to screen targets that were difficult to address with more traditional detection methods. While it enabled certain target-based screens, it also directly provided insights beyond the simple quantified effect, such as toxicity or unexpected changes in cellular phenotype. Classical screens with a single readout only allow the measured effect to be quantified; high-content screens promise more: additional phenotypic changes can be observed and quantified, allowing, in principle, a distinction of the chemical responses into three classes: similar to untreated cells, similar to control-treated cells, or dissimilar to both.
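The three-class distinction above can be sketched in code. The following is a minimal illustration, not any vendor's actual method: it assumes each well is summarized as a vector of z-scored image features, and uses hypothetical reference profiles and a hypothetical distance cutoff to assign one of the three classes.

```python
import numpy as np

def classify_response(profile, untreated_mean, control_mean, threshold=3.0):
    """Assign a well's phenotypic feature profile to one of three classes.

    profile, untreated_mean, control_mean: feature vectors (e.g. per-well
    averages of cell size, intensity, and texture features, z-scored).
    threshold: hypothetical cutoff in feature space; in practice it would
    be calibrated from the spread of the reference wells.
    """
    d_untreated = np.linalg.norm(profile - untreated_mean)
    d_control = np.linalg.norm(profile - control_mean)
    # Close to the untreated reference: compound had no visible effect.
    if d_untreated <= threshold and d_untreated <= d_control:
        return "untreated-like"
    # Close to the control-treated reference: expected, known effect.
    if d_control <= threshold:
        return "control-like"
    # Far from both references: an unexpected phenotype worth a closer look.
    return "dissimilar"
```

In a real screen the references would be built from many control wells per plate, and a multivariate distance (e.g. Mahalanobis) would usually replace the plain Euclidean norm used here for brevity.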
By definition, high-content screening is thus phenotypic screening, as the cells’ phenotypes are quantified. It represents classical biological research: change a condition and observe the outcome, albeit at a new level of scale. It also allows automated examination of subjects more complex than single cells, such as whole organisms, tissues, or even the formation of tissues and cellular interactions over time. Finally, it inherited from high-throughput screening the quality-control and standardization requirements that apply to all screening experiments: the experiment must be standardized to optimize reproducibility under defined conditions, typically represented by treated and untreated cells.
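A widely used quality-control metric for exactly this treated-versus-untreated standardization is the Z'-factor (Zhang et al., 1999), which the article does not name but which is standard across high-throughput and high-content screening. It compares the separation of positive and negative control wells to their variability:

```python
import statistics

def z_prime(positive, negative):
    """Z'-factor plate-quality metric:
    Z' = 1 - 3 * (sd_pos + sd_neg) / |mean_pos - mean_neg|.

    positive, negative: readout values from positive (treated) and
    negative (untreated) control wells on one plate. A value above
    roughly 0.5 is conventionally taken to indicate a robust assay.
    """
    mu_p, sd_p = statistics.mean(positive), statistics.stdev(positive)
    mu_n, sd_n = statistics.mean(negative), statistics.stdev(negative)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)
```

Well-separated controls with tight spread give a Z' near 1; overlapping controls drive it toward or below zero, flagging the plate for review.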
Besides its place in target-based and phenotypic screening, high-content screening is still the biologist’s workhorse in genetic screening: all genes of the organism under investigation (e.g., yeast) are systematically silenced, and the effect of each “gene knockout” on the phenotype is observed using image-based high-content screening. Similarly, high-content screening is applied in compound-combination studies to measure the effect of combined substances on model organisms.
While there is significant overlap between phenotypic and high-content screening, there are distinctions. For instance, label-free screening with biosensors is an alternative technology that can often detect the smallest physiological changes in organisms, such as a shift of a cell’s center of mass, without biochemical or genetic engineering of the cells (as would be needed, e.g., to apply an antibody-bound dye or introduce a receptor).
Nevertheless, high-content screening is the primary tool for phenotypic screening today, and most new chemical entities have been discovered by phenotypic screening. This does not, however, displace target-based screening from the drug discovery process. Target-based screening yields optimized entities more quickly and allows easier and more efficient optimization of a compound toward a target. Additionally, the FDA requires the target of a new drug to be identified and characterized before the drug can be released to the market.
All these advances in the natural sciences have produced a sophisticated set of tools and instruments for dissecting and understanding the effect of a given substance on model organisms. Successful phenotypic screening increases the mechanistic understanding of the underlying biology. With new experimental technologies, higher throughput, and advanced analytical tools, researchers can observe phenotypes in more relevant disease models and take a quantum leap forward.
Oliver Leven, Ph.D., is head of professional services for the Genedata Screener Business Unit at Genedata.