January 15, 2005 (Vol. 25, No. 2)

Lab Automation Cuts Both Ways

Comparing lab automation in drug discovery to the tools used in the gold rush may not be so far-fetched. “At first people panned for gold. Surface mining provided brute force, thus increasing the throughput, but the efficiency wasn’t much better. Geological mining improved the miners’ ability to find the actual gold,” says Kevin Hrusovsky, CEO, Caliper Life Sciences (www.calipertech.com).

According to Hrusovsky, drug discovery is at that juncture now. The era of relying solely on high throughput, brute-force testing has passed, and companies are now focused on higher quality experimentation. If drug companies can profile the selectivity of compounds much sooner and more precisely, they should be able to discover safe drugs faster.

Has the promise of lab automation lived up to its potential in helping drug companies to achieve their objectives?

“Laboratory automation is being transformed by technologies like microfluidics, which provides screening capabilities that also serve the emerging need for higher quality data,” explains Hrusovsky. “However, transformative technologies are by definition disruptive, and it takes time to revolutionize an entire industry.”

John Gerace, director of automated solutions for the biomedical research division at Beckman Coulter (www.beckman.com), explains that “lab automation has been used to impact the quality of experimental results, providing better accuracy and precision. This ultimately translates to more knowledge but not necessarily more throughput.”

“The expectation was that by automating a part of the drug discovery process more drugs would be developed in a shorter period of time,” says Chris McNary, vp/general manager of laboratory automation solutions at Thermo Electron (www.thermo.com).

“It was assumed that if you automate high throughput screening, the entire discovery process would automatically shrink. The reality is that it just exposed other highly manual processes that became the next process bottlenecks.”

The bottleneck keeps moving to a “different spot where the capability for automation exists, but has not been enabled,” continues McNary.

“ADME is a good example. As a company starts to develop drug leads, it hits a manual process bottleneck. Thus, automation can increase throughput and quality and improve overall ADME process management.”

1,536-Well Technology

Gerace believes that the key to success for lab automation in drug discovery is focusing on solutions, rather than generic tools. “For instance, the highly touted 1,536-well technology of a few years ago has its place, but there is a greater return on investment in low-volume 384-well technology at the moment,” he explains.

“While there is a need for more sophisticated dispensing and detection technology with 1,536 (wells), there have been big gains in decreasing well volumes from 96- to 384-well plates while using existing detectors.”

Jeff Hurwitz, director of robotics at Hamilton (www.hamiltoncompany.com), believes that cost and performance issues, such as the nanopipetting of samples, have hindered the use of 1,536-well assays for drug discovery. In addition, the standard inexpensive detectors, probes, reagents, and readers are inadequate for this format of screening.

Only about 10% of today’s drug discovery tests are run in 1,536-well microtiter plates, says Hrusovsky. Miniaturizing microtiter plates increases throughput and reduces cost per test, but drug discovery companies have to overcome some hurdles in liquid handling for homogeneous assays.
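To see why miniaturization exerts such a pull, a minimal back-of-the-envelope sketch helps. The well volumes and reagent cost below are illustrative assumptions, not figures from any of the companies quoted here.

```python
# Back-of-the-envelope economics of assay miniaturization.
# Well volumes and reagent cost are illustrative assumptions only.

PLATE_FORMATS = {96: 100.0, 384: 25.0, 1536: 5.0}  # wells per plate: assumed uL per well
REAGENT_COST_PER_ML = 10.0  # assumed reagent cost, dollars per mL

for wells, volume_ul in PLATE_FORMATS.items():
    reagent_ml_per_plate = wells * volume_ul / 1000.0
    cost_per_test = REAGENT_COST_PER_ML * volume_ul / 1000.0
    print(f"{wells:>5}-well plate: {volume_ul:6.1f} uL/well, "
          f"{reagent_ml_per_plate:6.2f} mL reagent/plate, "
          f"${cost_per_test:.3f} per test")
```

Under these assumed numbers the reagent cost per data point falls twentyfold between 96 and 1,536 wells, which is the economic incentive Hrusovsky describes; the liquid-handling hurdles come from dispensing those 5-uL and smaller volumes reliably.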

When artifacts are created by fluorescence interference, assays produce false positives that hurt productivity. Traditional separation steps to confirm the data may not be effective.

On Your Own vs. Getting Help

Gerace points out that the big, expensive solutions for drug discovery have given way to off-the-shelf solutions that can be customized. Instead of using a dedicated programmer to develop a solution in months, these systems can be taken out of the box and be up and running in days.

Others, however, still might need some help. According to McNary, “Customers are realizing that they can’t do it alone. They want providers who can come in and do the whole lab automation scheme. They’re looking for a single vendor with a broad product portfolio to act as process consultants and as a single point of contact from lab construction through ongoing applications support.”

Caprion (www.caprion.com), a drug discovery and development company applying its proteomics technologies to create pharmaceutical products for cancer and infectious disease, uses lab automation for plasma profiling in blood samples from clinical trials. The objective is to look for pharmacodynamic markers.

Gregory Opiteck, Ph.D., director of the department of protein analysis at Caprion, points out, “We couldn’t do the routine, mundane jobs or get the level of reproducibility without robots.”

Although some pharmaceutical companies have hired entire departments to build robots, the plasma profiling field can take off-the-shelf solutions developed for combinatorial chemistry and high throughput screening and adapt them to its purposes. “I’d rather spend the time and money finding biomarkers than reinventing the wheel,” Dr. Opiteck adds.

Greg Hollis, Ph.D., vp of applied technology at Incyte (www.incyte.com), believes that automated pipetting tools “allow scientists to exercise their gift for science, rather than doing mundane tasks.” They save time, allow drug discovery companies to do more, move faster, minimize ergonomic problems, and provide higher accuracy.

“Liquid-handling tools can screen many samples at a time,” says Dr. Hollis. “If a company has 300,000 compounds in a high throughput screening library, it can move from stock solutions to plates and screen 50,000 to 100,000 samples per day. That means a lot of people or one pipetting machine.”
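The figures in that quote reduce to simple campaign arithmetic. The sketch below works it through, assuming a 384-well format with a handful of wells reserved for controls on each plate; both the format and the control count are assumptions for illustration, not details from the article.

```python
import math

# Campaign arithmetic for the library and rates Dr. Hollis cites.
LIBRARY_SIZE = 300_000
WELLS_PER_PLATE = 384   # assumed assay format
CONTROL_WELLS = 16      # assumed wells reserved for controls per plate

samples_per_plate = WELLS_PER_PLATE - CONTROL_WELLS
plates_needed = math.ceil(LIBRARY_SIZE / samples_per_plate)
print(f"{plates_needed:,} plates to cover {LIBRARY_SIZE:,} compounds")

for rate in (50_000, 100_000):
    days = math.ceil(LIBRARY_SIZE / rate)
    print(f"at {rate:,} samples/day: ~{days} days of screening")
```

A campaign of well under a week by machine, versus the equivalent manual pipetting effort, is the “a lot of people or one pipetting machine” trade-off.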

Dr. Hollis finds lab automation powerful because he uses it in focused ways. Rather than automate a whole process, his lab does a specific task and then moves to another piece of equipment.

Balancing Productivity and Cost

As a mid-throughput pharma company, Isis (www.isip.com) sees its challenge as balancing productivity with the cost of equipment, physical space, and human capital.

“We’re most productive when we have the ability to perform assays 24/7 and run them unattended,” explains Rich Griffey, Ph.D., vp of chemistry, Isis. “We get the most out of the available time, leverage human resources, have the positive ability to find hits for particular targets and, thus, a higher probability of leads that can be turned into drug candidates.”

Jim Keck, vp of biology research at Telik (www.telik.com), claims that high throughput tools have enabled his company’s underlying technology. Telik, a biopharmaceutical company focusing on small molecule drugs to treat cancer and other serious diseases, uses a proprietary drug discovery technology, TRAP (Target-Related Affinity Profiling), for the discovery of small molecule drug candidates.

TRAP takes advantage of the fundamental principle that all pharmaceutically active molecules work by interacting with a protein target. Telik has several products in clinical trials; all of them have been developed internally using TRAP.

Varied Approaches

Each drug discovery company has a different idea of screening, thus making it hard for an automation company to come up with standard systems, points out Hurwitz. Integrated systems and workstations have to be open and flexible, and platforms have to work in many areas in drug discovery, genomics, and proteomics labs.

The closest to a standardized procedure is sample extraction using industry standard kits, notes Hurwitz. Drug discovery companies may do from 200 to 2,000 plates per day, and every company has a different vision of what constitutes stable compound storage.

McNary points out that current market dynamics include a change from large production facilities to smaller islands of automation with integrated focused libraries enabling real-time screening against more specific targets. More targeted screening campaigns will be done on a smaller scale, with modular flexible systems capable of performing a variety of discovery functions.

“The key is the use of cells, performing analyses in vivo instead of in vitro, as near to the native state of the biological human being as possible,” he says.

According to Griffey, there are major storage problems. Not all assays are suited to automation. A large investment is required to get assays into a format where they can be automated.

“For low-cost users, there is a big jump to dispense liquid in very small volumes, so they stay with 96-well formats. It’s hard to mix and match old and new equipment,” he adds.

Handling the Data

Mark Harnois, senior product marketing manager at Waters (www.waters.com), notes, “Technologically, analytical instruments are providing more information, which has to be captured, catalogued, and stored. We’re seeing the benefits of the informatics part of lab automation, but we haven’t seen all the benefits yet.

“Systems with open architecture can capture information from all systems into a central repository. ‘How much is adopted?’ is the question. We’re just starting to see the benefits of data interpretation and screening. We need to have more industry standards so that when new products generate information, they are vendor-neutral and allow scientists to move from one tool to another.”

“Every automation vendor has different software,” adds Hurwitz. “Some software is becoming similar, so users can interface. We need software to drive systems to be easier to use.”

Where to Go from Here

Dr. Hollis believes that there will be a continuing progression of tools as novel technologies need to be automated. New trends, according to Gerace, include miniaturization (nanoliter fluid transfers), multiplexed assays, sample and target preparation on chips, automation for preclinical and clinical trials, molecular diagnostics, and simplifying and automating bioassays for immune response.

Hrusovsky anticipates a move toward molecular diagnostics, where panel testing of the population will provide a genomic profile to help physicians prescribe medicine. He also expects microfluidics to help obtain high quality data and correct problems by further miniaturization.

Hurwitz says that the equipment used for high throughput screening is getting much smaller. “The cost of components is the driving issue on miniaturization. Key issues are stability, throughput, and miniaturization versus cost.”

In five years, lab automation systems will be more intelligent and flexible, notes McNary.

“There will be a higher probability of success with modular flexible systems that totally integrate the discovery process and incorporate technologies such as microfluidics, on demand screening, cellular imaging, and in silico predictive modeling,” he says.

“Lab automation has been associated with analytical instruments,” adds Harnois. “Now we have to take paper notebooks and make them available electronically for patent defense and make behavioral changes about shared information.

“We need to be able to witness and document information, share in collaborations, and incorporate additional knowledge from the IT side to complement analytical instruments to provide solutions that manage information.”

“In five years, the quality of the information will be absolutely paramount,” maintains Hrusovsky. “We have to be careful that we’re not creating high throughput confusion.”
