August 1, 2005 (Vol. 25, No. 14)

Transformative Developments Mark the Historical Journey in Drug Discovery

With a wealth of revolutionary technologies in hand, pharmaceutical developers today don’t leave much to chance. But as luck would have it, serendipity did play a large role in drug discovery in the past.

The story of warfarin, the most widely prescribed anticoagulant drug, illustrates how drugs were discovered way back then, and how technologies have changed our approach.

Frustrated by the decimation of his herd by sweet clover disease, caused by spoiled sweet clover hay, a farmer went to the agricultural experiment station at the University of Wisconsin for help in 1933, where he met Karl Paul Link, Ph.D.

Seven years later, Dr. Link and his colleagues went on to identify dicumarol as the hemorrhagic agent of sweet clover. Warfarin was selected from a collection of dicumarol derivatives as a reliable rat poison in 1948, and it wasn’t until 1953 that human testing of warfarin began. Physicians were only comfortable with its safety after someone unsuccessfully attempted suicide with a 567-mg dose.

Although warfarin was known to be a vitamin K antagonist, it wasn’t until the 1970s that researchers understood it targeted vitamin K epoxide reductase activity. And perhaps even more surprisingly, the gene encoding vitamin K epoxide reductase was only reported in 2004.

Indeed, a “number of examples can be found of compounds discovered a long time ago on the basis of animal behavioral effects, without much, if any, data to substantiate the mechanism of action,” says François Roman, vp, R&D, Euroscreen (www.euroscreen.com).

“In contrast, compounds like etanercept (Enbrel) or infliximab (Remicade) are developed on the basis of their very precise mechanism of action (TNF-α antibodies).”

There is no doubt that drug discovery has changed with the times, but spectacular technology development within the last 25 years has absolutely transformed the industry.

To get a better understanding of how the science and actual process of drug discovery has changed since 1980, GEN asked senior management at ten different technology-based companies about the major developments, advancements, and trends that have taken place over the last quarter century.

Genomic Revolution

“I think the most fundamental change is the shift from a forward-genetics approach to reverse genetics,” says William S. Marshall, Ph.D., executive vp, R&D at Dharmacon (www.dharmacon.com), a company involved in the identification of the vitamin K epoxide reductase gene.

In the more classical forward-genetics approach, the phenotype comes first, and the gene is identified later. With reverse genetics, you start with the gene and determine the phenotype when the gene product is inactivated. The ability to do large-scale reverse-genetics experiments became possible with the sequencing of the human genome, a working draft of which was announced in 2000.
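The logic of a reverse-genetics screen can be captured in a few lines of code. The sketch below is illustrative only: the gene names, viability readings, and hit threshold are hypothetical, not drawn from Dharmacon or any company quoted here.

```python
# Minimal sketch of a reverse-genetics screen: start from the gene, silence
# it, and score the resulting phenotype. All data here are hypothetical.
from statistics import mean, stdev

# Hypothetical cell-viability readings after siRNA knockdown of each gene,
# plus a non-targeting control.
knockdown_viability = {
    "control": [1.00, 0.98, 1.02, 0.99],
    "GENE_A":  [0.45, 0.50, 0.48, 0.43],  # strong phenotype
    "GENE_B":  [0.97, 1.01, 0.95, 1.00],  # no phenotype
}

ctrl = knockdown_viability["control"]
for gene, values in knockdown_viability.items():
    if gene == "control":
        continue
    # z-score of the knockdown mean against control variability
    z = (mean(values) - mean(ctrl)) / stdev(ctrl)
    hit = abs(z) > 3  # simple hit-calling threshold
    print(f"{gene}: mean viability {mean(values):.2f}, z = {z:+.1f}, hit = {hit}")
```

Scaled to a genome-wide siRNA collection, the same loop runs over tens of thousands of genes, which is exactly the kind of experiment the human genome sequence made possible.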

“The availability of the sequence of the human genome and of other genomes as a result of the tremendous contributions of many scientists worldwide is a very important factor in drug discovery today,” says Gary Dahl, Ph.D., president, Epicentre Biotechnologies (www.epicentre.com). “In the past, it was a more hit-and-miss process to find a potential new drug target, certainly much less systematic and global.”

According to Dr. Dahl, “one important way that the human genome sequence is used in drug discovery is for gene expression profiling,” which is used for both target identification and validation, and for the analysis of the biological effects of drug candidates on cells.
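In its simplest form, expression profiling reduces to comparing transcript levels between conditions. The toy sketch below assumes hypothetical signal values and a conventional fold-change cutoff; real analyses add normalization and statistical testing.

```python
# Minimal sketch of expression profiling for target identification: compare
# transcript levels in diseased vs. healthy samples. Values are hypothetical.
import math

expression = {  # gene -> (mean healthy signal, mean diseased signal)
    "GENE_X": (120.0, 960.0),  # strongly up-regulated in disease
    "GENE_Y": (500.0, 510.0),  # essentially unchanged
    "GENE_Z": (800.0, 95.0),   # strongly down-regulated
}

for gene, (healthy, diseased) in expression.items():
    log2_fc = math.log2(diseased / healthy)
    # |log2 fold change| >= 2 (a 4x change) flags a candidate target
    candidate = abs(log2_fc) >= 2
    print(f"{gene}: log2 fold change {log2_fc:+.2f}, candidate = {candidate}")
```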

Genome sequence information, combined with access to gene expression profiling tools and loss-of-function technologies, such as RNA interference, “opens the spectrum for people to examine many more targets for intervention,” says Dr. Marshall.

According to him, Dharmacon was the first company to release a comprehensive genome-wide collection of siRNAs that can be used “in concert with cell-based assays of gene function to rapidly identify new genes and validate that they’re important in disease.”

Combinatorial Chemistry

Not only does pharma now have a plethora of potential targets thanks to the genomics revolution, but technologies such as combinatorial chemistry and in silico screening also make it possible to explore an ever greater diversity of chemical structures.

“The substance libraries were more limited before combinatorial chemistry,” says Stefan Löfås, CSO of Biacore International (www.biacore.com). Twenty-five years ago, you “couldn’t work in high throughput format the way people are working” today.

“For a large majority of companies, the structure-activity relationship (SAR) approach has been replaced by HTS technology, helped on one side by the automation of chemical synthesis and on the other by the development of fast fluorescence detectors,” says Roberto Pasqualini, Ph.D., scientific advisor, development, CIS bio International (www.cisbiointernational.fr), a subsidiary of Schering.

“The solid-phase synthesizers have permitted the synthesis of large amounts of ‘randomized’ molecules whose affinity toward a receptor may be measured by HTS instruments. The establishment of such technologies was linked to the concomitant development of high-speed computers.”
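The power of the combinatorial approach is multiplicative, as a short sketch makes plain. The scaffold and R-group names below are hypothetical placeholders, not real chemistry:

```python
# Minimal sketch of combinatorial enumeration: a library grows
# multiplicatively from a handful of interchangeable building blocks.
from itertools import product

scaffolds = ["scaffold1", "scaffold2"]
r1_groups = ["methyl", "phenyl", "chloro"]
r2_groups = ["amine", "hydroxyl", "carboxyl", "nitro"]

library = [f"{s}-{r1}-{r2}" for s, r1, r2 in product(scaffolds, r1_groups, r2_groups)]
print(len(library))   # 2 x 3 x 4 = 24 compounds from just 9 building blocks
print(library[:3])
```

With hundreds of building blocks per position, the same arithmetic yields libraries of millions of distinct structures, which is why HTS and high-speed computing had to develop in step.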

Computational Biology and Chemistry

The tremendous innovation in computational sciences supported the genomics revolution and advances in combinatorial chemistry, but it also fundamentally changed the way pharmaceutical researchers completed mundane, everyday tasks.

“Twenty-five years ago, scientists had to document the results of many experiments manually, which, of course, significantly impeded the ability to generate a lot of drug leads,” says Peter Coggins, Ph.D., president of PerkinElmer Life & Analytical Sciences (www.perkinelmer.com). “Documenting and evaluating the results of lab and drug discovery experiments” consumed significant time and money.

“The incorporation of the PC in daily life of drug discovery is now a fact of life,” says David Edwards, director of computational biology at Accelrys (www.accelrys.com), a provider of computational science and informatics software.

According to Edwards, many of today’s computational methods trace their roots back to the early 1980s. “Tools were just emerging” in the area of 3-D modeling, and “people were starting to develop algorithms for the analysis of protein and DNA sequences.”

“We have a product called Accelrys GCG. It came out of the University of Wisconsin at Madison, and was the first set of algorithms for protein and DNA sequence analysis back in 1980,” says Edwards. “Now the product has over 140 algorithms, things like BLAST and ClustalW. Things that people use on a daily basis.”

Computational biology is now routinely used for the analysis of protein pathways, 3-D protein modeling, virtual libraries, and in silico screening. Twenty-five years of development have made it possible to discover leads “faster, better, and cheaper,” he adds.

Advances in automation, LIMS, and informatics allow pharma to “interpret and act upon the results of lab activity much more rapidly,” Dr. Coggins agrees.

As an example, Edwards explains that Biogen Idec (www.biogen.com) employed Accelrys’ in silico screening software to identify a lead compound for a new cancer target, TGF kinase. The process took only about two months, significantly less time than it took another group that independently discovered the same lead through traditional methods.
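As one hedged illustration of what an in silico screening step can look like, the sketch below applies Lipinski’s rule of five, a widely used drug-likeness filter, to a hypothetical virtual library. The compound names and property values are invented; in practice the properties are computed from chemical structures, and docking and many other filters follow.

```python
# Minimal sketch of one in silico screening step: filtering a virtual
# library with Lipinski's rule of five. All values are hypothetical.
def passes_rule_of_five(mw, logp, h_donors, h_acceptors):
    """True if a compound violates at most one of Lipinski's criteria."""
    violations = sum([
        mw > 500,          # molecular weight
        logp > 5,          # lipophilicity
        h_donors > 5,      # hydrogen-bond donors
        h_acceptors > 10,  # hydrogen-bond acceptors
    ])
    return violations <= 1

virtual_library = [
    # (name, molecular weight, logP, H-bond donors, H-bond acceptors)
    ("cmpd-001", 342.4, 2.1, 2, 5),
    ("cmpd-002", 712.9, 6.3, 4, 12),
    ("cmpd-003", 488.5, 4.8, 1, 9),
]

leads = [name for name, *props in virtual_library if passes_rule_of_five(*props)]
print(leads)  # ['cmpd-001', 'cmpd-003']
```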

High Throughput/Automation Technologies

With today’s high throughput technologies, even the wet lab approaches are fast, especially relative to 25 years ago. “Drug discovery went from doing experiments the old way to a lot of high throughput technologies, automation that allows scientists to do experimentation at a faster and faster pace,” says Andrea W. Chow, Ph.D., vp, microfluidics R&D, Caliper Life Sciences (www.caliperls.com).

Dr. Chow points to a trend of using higher and higher degrees of automation, with 96-, 384-, and even 1,536-well plates, all the while trying to miniaturize reagent usage and explore much greater chemical diversity.

“The mentality has been that doing more is better, faster is better,” she says, adding that while the current industry target is to screen 100,000 compounds a day, scientists are starting to question whether sheer throughput is the answer.
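A quick back-of-the-envelope calculation shows why plate density matters at that scale. Assuming one compound per well and a single replicate (a simplification; real screens also run controls and replicates):

```python
# Plates needed per day to hit the 100,000-compound screening target
# at each plate density mentioned above (one compound per well).
TARGET_PER_DAY = 100_000

for wells in (96, 384, 1536):
    plates = -(-TARGET_PER_DAY // wells)  # ceiling division
    print(f"{wells:>5}-well plates per day: {plates}")
```

Moving from 96- to 1,536-well plates cuts the daily plate count from over a thousand to about 66, with a corresponding reduction in reagent volumes.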

Advances in mass spectrometry, proteomics, and systems biology have also propelled long-lasting changes in how pharmaceutical companies approach drug discovery. The growth of the biomarkers industry is one key example.

Pharmacogenomics

Edwards points out that “twenty-five years ago nobody had any idea” that medicines work differently in different people. “The way we did clinical trials and analyzed results and populations, you might subset by male/female, but not by ‘this person has a particular set of genes.’”

Today, “pharma is looking more toward cellular sciences and systems biology (proteomics, biomarkers) to help them in their pharmacogenomics activities,” according to Dr. Coggins. “The clinical diagnostic and research areas are beginning to merge, and it really all revolves around what we call molecular medicine: using the molecular signature to diagnose, design, and administer interventions.”

“The need for a biomarker is commonly recognized as a preliminary prerequisite when starting a clinical study,” says Roman. “The development of microarray technologies allowing the analysis of gene expression is one of the major changes with important impacts on the process. This technology is expected to bring radical improvements in terms of therapeutic development but also in diagnostic applications.”

“Biomarkers can be influential in every phase of drug development, from drug discovery and preclinical evaluations through each phase of clinical trials and into post-marketing studies,” agrees Dennis Gilbert, Ph.D., CSO, Applied Biosystems (www.appliedbiosystems.com).

Higher Standards

The availability of revolutionary technologies may have inadvertently led to increased demands on the level of characterization of drug candidates.

“Previously, toxicity was discovered when a clinical trial patient had an adverse reaction. Now toxicity can be detected much earlier to rule out any potential harmful compounds in the drug discovery pipeline,” claims Dr. Gilbert.

Löfås says that the “higher regulatory demands in terms of toxicity and other side effects have changed the way pharma companies need to work. The need to validate drug candidates earlier slows down the process.”

“Twenty-five years ago it was still possible to develop a compound with efficacy in in vivo animal models as the only evidence. Today, knowledge of the molecular target is necessary,” adds Roman.

There is “tremendous pressure to understand that drugs are effective and safe very early” in the drug discovery process, adds Dr. Marshall. If you don’t know the molecular target of a new drug, you don’t know its potential side effects downstream.

Soul Searching

Dr. Chow believes the industry has been doing “a lot of soul searching in the last five years,” thinking about why it is “spending a lot more money, with a greater number of technologies at our disposal, doing way more experiments than ever before, and yet hasn’t found more interesting molecules.”

Dr. Chow says that one high price of automation can be a decrease in data quality. You may have “more data but the data has a big error bar,” she says. “Now the pendulum has swung back. People want more validated targets, cleaner compound libraries, and higher quality data.” The current trend is to “back off on quantity and throughput a little bit.”
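One standard way to put a number on that error bar is the Z′-factor, a routine HTS assay-quality statistic that compares the separation between controls to their noise. The control readings below are hypothetical:

```python
# Minimal sketch of the Z'-factor: an assay-quality score built from the
# means and standard deviations of positive and negative controls.
from statistics import mean, stdev

positive_ctrl = [0.95, 0.90, 0.93, 0.97, 0.92]  # hypothetical full-signal wells
negative_ctrl = [0.10, 0.14, 0.08, 0.12, 0.11]  # hypothetical background wells

z_prime = 1 - 3 * (stdev(positive_ctrl) + stdev(negative_ctrl)) / abs(
    mean(positive_ctrl) - mean(negative_ctrl)
)
# By convention, Z' > 0.5 indicates an excellent assay window
print(f"Z' = {z_prime:.2f}")
```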

Echoing that sentiment, Edwards notes that “one of the interesting things that has happened over the last five to eight years” is that, as the human genome project was nearing completion, people focused on large-scale analysis of the genome. But more recently, there’s been a move toward “more hypothesis-driven research rather than whole genome” approaches.

“It has been to some extent disappointing that the number of approved new chemical entities has not been as high as expected,” says Löfås.

Even so, Löfås is confident the technologies will pay off eventually. “I would say that the decrease in the number of approved drugs per year is just temporary, an effect of the change in technologies. It takes some time before they pay off.”

Future

While many pharmaceutical companies have focused on the development of small-molecule drugs, the optimization of peptide drugs and even RNA interference drugs may increase the potential of therapeutics even further.

Manufacturing efficiencies have lowered the cost of raw materials for peptides, and approaches to optimize biologicals through chemical modifications, such as the addition of polyethylene glycol (PEG), have also proved beneficial.

“What has changed in twenty-five years is that optimized biologicals, including peptides, are now being introduced to the clinic in studies that seem to indicate that these entities have greater market potential than small molecules,” states Jim Hampton, vp, business development, American Peptide Co. (www.americanpeptide.com).

Dr. Marshall is hopeful that siRNA will eventually be useful as “therapeutic entities themselves,” as this would greatly expand the definition of a “druggable” target.

Continued innovation and change is expected in the future. “With such an explosion in knowledge and techniques, it is difficult to imagine what modern drug discovery will be like in another 25 years,” adds Dr. Dahl.
