Feature Articles : Jan 15, 2009
Automation Drives Laboratory Economics
Traditional Hands-On Scientists Discover Expediency and Utility of Enabling Technology
When automation technology first became available for drug discovery, the equipment was expensive and fairly complex to use, and needed a lot of attention and maintenance to stay up-and-running. This situation limited applications to high-throughput screens in large organizations.
Lab automation tools are now more reliable and easier to use, making them accessible not only for high-content compound screening but also for more speculative applications, even in academic and basic research. Scientists are now able to use automated tools in novel ways—even in the earliest stages of assay development—and, at times, the adaptability of the technology may surpass that of the scientists using it. Automated tools bring dramatically greater efficiency while changing many of the fundamental paradigms of drug discovery research.
“LabAutomation 2009”, to be held in California later this month, will highlight new developments and applications of laboratory automation tools. Many of the presentations have exciting implications for drug discovery. A consistent theme is how to adapt the workflow to make the most of the automation, even when that workflow is radically different from what most people are accustomed to.
For example, one of the most surprising applications to be showcased at the conference involves using a robot to take aliquots from frozen samples without thawing them. Dale Larson, director of biomedical engineering at Draper Laboratory, will present positive results of an ongoing project to develop a “mini ice core” method for sampling frozen tubes of blood or other fluids.
The project originated when one scientist refused to share samples with another because “your project isn’t important enough to thaw my samples.” That mean-spirited refusal inspired Larson to investigate a way to remove aliquots from frozen samples without exposing them to the rigors of a thaw-and-refreeze cycle.
Draper’s sampling technique actually resembles the ice-coring methods used by geologists to sample layers of ice in a glacier. A hollow needle with a cutting surface drills into the sample. When the needle is withdrawn, there is a core of sample inside the needle, which is ejected into an empty tube by a piston.
Although the concept is simple, there were some complications to overcome. Larson’s group needed to learn how to drill into the sample without creating fractures, and to drill the full depth of the tube to sample the full biochemical composition of the liquid. “We can’t have chips and fractures in the ice left behind,” says Larson, “so we had to come up with conditions that allowed us to drill in and leave behind a clean sample.”
The method can remove five samples from a 1.8 mL tube before the sample must be thawed and refrozen—that’s about 500 µL, or roughly one-third of the sample. Although most labs will want to keep track of the number of times that the tube has been sampled, a nice feature of the system is that it can sense the regions that have been cored independent of the informatics system, and report back to the operator if there is no more room to take aliquots.
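As a back-of-the-envelope illustration (not Draper’s actual control software), the per-tube accounting described above can be sketched in Python. The tube volume, core count, and per-core volume are taken from the figures in the article; the class and method names are hypothetical:

```python
# Hypothetical sketch of aliquot accounting for the "mini ice core" method.
# Figures from the article: a 1.8 mL tube yields five frozen cores
# (~500 uL in total) before a thaw/refreeze cycle is required.

TUBE_VOLUME_UL = 1800
MAX_CORES = 5
CORE_VOLUME_UL = 100  # ~500 uL spread across five cores

class FrozenTube:
    def __init__(self):
        self.cores_taken = 0

    def take_core(self):
        """Remove one frozen aliquot; refuse when the tube is exhausted."""
        if self.cores_taken >= MAX_CORES:
            raise RuntimeError("no room left to core; tube must be thawed")
        self.cores_taken += 1
        return CORE_VOLUME_UL

    def fraction_sampled(self):
        return self.cores_taken * CORE_VOLUME_UL / TUBE_VOLUME_UL

tube = FrozenTube()
total = sum(tube.take_core() for _ in range(5))
print(total)                              # 500 uL removed in total
print(round(tube.fraction_sampled(), 2))  # 0.28, roughly one-third
```

A sixth `take_core()` call raises an error, mirroring the instrument’s ability to report when there is no more room to take aliquots.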
Getting the Biology Right
Another new frontier in automation is an area of research that had been difficult-to-impossible to automate in the past: cell-based assays. Cell-based screening has a number of advantages over biochemical screens, the main one being that it is a close representation of the biology of the environment in which the final drug will be used. With older-generation automated systems using large robotic liquid-handling systems, it was difficult to model cellular biology within the assay. Now, a number of systems are coming online that incorporate more biologically appropriate assay conditions.
Odyssey Thera has developed a high-throughput platform that tracks subcellular protein complexes using a system of automated microscopes and unique cellular probes. According to the company, the strength of its platform lies in the combination of automation and systems biology, and the large number of molecules that comprise Odyssey’s database. The system is able to use image analysis and a proprietary IT infrastructure to process the assays at a rate of millions of wells per run, reports John Westwick, Ph.D., president and CEO.
“The majority of our day-to-day effort entails using the platform to look at test agents and candidates from our big pharma partners. We know what successful and failed or toxic drugs look like. We can look at hundreds of thousands of drug candidates and flag those that might have safety issues.”
In addition to toxicology prescreening, the platform has great potential for drug repurposing and reindication, Dr. Westwick adds. By running known agents across the large panel of cellular assays, new therapeutic indications can be identified by comparing signatures to the pathway and systems biology information in the database. “We have been able to identify mechanistic signatures, and redraw many of the cellular signaling pathways,” he notes. “Most important, perhaps, are the connections we’ve identified between known pathways—in the long term that may be our most important contribution.”
Fabrus has a new take on some older technology—a way of expressing antibody Fab fragments in bacteria more efficiently. It has expanded on that method and adapted it to a system that makes antibody Fab production and screening a little more like small molecule screening. It’s a way to bridge or adapt biologics research to the more common paradigms and technologies of drug discovery, the company reports. Using the system, Fabrus has created a library of Fab fragments based on human antibody sequences, individually addressed in 384-well plates.
Fabrus is using an automation solution called Piccolo from The Automation Partnership. This room-sized instrument enables the company to express and purify up to 576 samples per week with high yields and an average of 50 µg of protein per sample, with just one operator.
Most antibody screening assays are based on binding rather than on any functional activity of the antibody; a final antibody emerges through rounds of mutation and maturation, and only then do functional studies begin. Fabrus has worked around that multistep process by using naïve antibodies, not raised against any specific antigen, and screening them much like a small molecule library, with the goal of producing a therapeutic antibody.
High-throughput screens have traditionally been difficult to conduct under natural biological conditions such as physiological temperatures. For this reason, Bernhard Becker, a research assistant at the Technical University of Munich (TU; portal.mytum.de), has been working on a high-content screening system based on living cells that is small enough to fit inside a temperature-controlled chamber.
The system is primarily intended for chemosensitivity testing, to determine the most effective medication for cancer patients. It measures the metabolism and morphology of the cells through pH, pO2, and impedance values, supplemented by imaging.
Joachim Wiest, an engineer with Cellasys, which is working with TU on the prototype, explains that “the biggest problem to get these systems to run is to solve interdisciplinary problems—to get all this knowledge together. Maybe you can find a good engineer or biologist, but it’s hard to find a good bioengineer.”
An interesting wrinkle in the development of the system is that, although it includes microscopic images, these are not technically necessary to assess cell morphology: the impedance measurement already provides that information. However, microscopy is an element most physicians are not ready to part with when it comes to cell morphology, and it is a useful means of validation. “Most doctors are more used to seeing a microscopic image than a metabolic rate,” says Becker. “At the moment we have to use it. If you want to get the system in the hospitals, you will have to work with the doctors first.”
Under New Management
High-throughput screening affects more than just the assays in the main development pipeline. Supportive tools and services can also be affected.
Meeting the demand for compounds for high-throughput screening is a challenge that requires some innovative, high-throughput solutions of its own. Mike Stock, group leader in compound management operations at BioFocus DPI (www.biofocus.com), talks about what it’s like to do compound management for the National Chemical Genomics Center (NCGC, part of the NIH), one of ten centers in the molecular library screening network.
At the beginning of the project, he says, there was just a small repository, and they were starting from scratch. It was an opportunity to design the repository and the compound-management system from the ground up to serve a high-throughput screening facility.
“We had to design and build a process without having end users to talk to,” adds Stock. The first end user BioFocus worked with was the NCGC, which was therefore heavily influential in the shape of the final system.
Among the most important considerations in compound management is the volume of compound solution to maintain and to aliquot for each screen. BioFocus settled on 45 µL of 10 mM solution in DMSO (dimethyl sulfoxide) per year.
Another factor was the storage of the tubes. BioFocus opted for a system that uncaps and recaps the tubes, instead of one with a septum penetrated by a needle. To counteract the effects of air on the solution, it handles the tubes in a nitrogen atmosphere, which has proven to be a successful strategy for compound management.
DMSO is hygroscopic, and each exposure to air results in water getting into the solution and possibly changing the solubility of the compound. “One of the things we did agree to, as part of the NIH collaboration, was to do an annual maintenance quality control check,” Stock points out. “This is where we do the statistical sampling of the compounds kept as solutions. We agreed to check them, and if they weren’t any good, we would get rid of them and start over. We have two year-old solutions that we’ve handled quite often and the QC is good. We haven’t had any stored DMSO solutions fail our maintenance QC.”
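The annual maintenance QC Stock describes amounts to statistical sampling of the stored solutions. A minimal Python sketch of that idea follows; the sample size, pass criterion, and purity values are illustrative assumptions, not BioFocus’s actual procedure:

```python
import random

# Illustrative model: each stored DMSO solution carries a measured purity
# (e.g., fraction of the expected compound detected by an analytical QC check).
inventory = {f"CMPD-{i:04d}": 0.97 + random.random() * 0.03 for i in range(1000)}

PASS_THRESHOLD = 0.90  # assumed pass criterion, not from the article

def maintenance_qc(inventory, sample_size=50, seed=42):
    """Draw a random statistical sample of stored solutions and
    return the IDs of any that fail the QC threshold."""
    rng = random.Random(seed)
    sampled = rng.sample(sorted(inventory), sample_size)
    return [cid for cid in sampled if inventory[cid] < PASS_THRESHOLD]

# Failed solutions would be discarded and re-made, per the NIH agreement.
failures = maintenance_qc(inventory)
print(len(failures))  # 0 for this synthetic inventory (all purities >= 0.97)
```

The key design point is that only a random sample is re-assayed each year, so the QC cost stays flat as the repository grows.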
For comparison, many pharmaceutical companies have had no QC on their stored DMSO compounds, but some of them have found that degradation in stored compounds has resulted in problems with hit confirmation and follow-up within high-throughput screens. BioFocus is also piloting single-use plates to send to the screening facility upon request.
For many scientists this represents a real shift in thinking, especially for those who like to keep their compounds and other stored supplies close by for peace of mind; however, they are slowly coming to realize that, for the sake of the integrity of the stored compounds, it may be better to let go and let a specialized facility handle them.
On one hand, researchers will have to let go of older ways of doing things in order to make the fullest use of automated high-throughput tools; the benefits certainly outweigh the discomfort of making the change. On the other hand, the costs are not so easily dismissed.
Labs have dealt with this in various ways, such as sharing high-throughput resources like the NIH Screening Centers Network, or by teaming up with a larger company to share a piece of advanced equipment. Many labs are building their own prototypes and designing their own systems. It’s an exciting time to be getting started with high-throughput automation.
© 2013 Genetic Engineering & Biotechnology News, All Rights Reserved