January 1, 2013 (Vol. 33, No. 1)

Just two decades ago we were only beginning to recognize the potential of automated technologies to enhance throughput in drug discovery research. Today, it is difficult to imagine a modern laboratory without robotic equipment.

“We see continuous trends toward increased reliability of robots, partially driven by the introduction of new types of motors,” reports Malcolm Crook, Ph.D., CTO, Process Analysis & Automation (PAA). “Another trend is toward smaller, targeted systems that still flexibly accommodate peripherals as needed.”

PAA will be presenting the “automate.it harmony,” its user interface for software integration, at the Society for Laboratory Automation and Screening (SLAS) meeting later this month.

Conventional small molecule discovery is driven by high-throughput screening (HTS) centers, complete with expensive robotics designed to move microplates between various stations for compound dispensing, assay dispensing, incubation, and optical screening.

The federally funded Molecular Libraries Program supports nine comprehensive screening centers, with the hub at the NIH Chemical Genomics Center—a large-scale, ultra-high-throughput operation capable of generating over 2 million datapoints per week. The NIH center was set up to work on lesser-known targets implicated in rare and neglected diseases.

“We routinely need to adapt our assays for targets with limited availability,” says Anton Simeonov, Ph.D., chief of the chemical genomics branch. “Working with a unique enzyme or primary cells from a patient with a rare disease calls for conservation of resources at each step.”

The center introduced multiple innovations in designing assays and in adapting them to their fully integrated GNF/Kalypsys robotic system. The system is capable of storing over 2.2 million compound samples, performing multiple assay steps in 1,536-well format, and measuring custom output signals. Three high-precision robotic arms circulate between peripheral units.

“Because we run dozens of protocols, each with its own sophisticated screening logic, our robotic engineers work closely with vendors to modify the source code in real time,” says Dr. Simeonov. The team fully redesigned the traditional dose-response screening protocol to conserve precious starting material. Instead of creating dilution series on the same plate, dilution series are created across plates, so each plate contains only one concentration of a given compound. This strategy helps generate custom dose-response curves and minimizes potential errors stemming from liquid handling or equipment malfunction.
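
The inter-plate titration described above can be sketched in a few lines. This is an illustrative model only, with hypothetical function and compound names (the center's actual software is not public): each compound keeps a fixed well address, and the dilution series runs across plates, so every plate carries exactly one concentration per compound.

```python
def interplate_titration(compounds, top_conc_um, dilution_factor, n_plates):
    """Return {plate_index: {well: (compound, concentration_uM)}}.

    Hypothetical helper: the dilution series runs *across* plates,
    so each plate holds a single concentration of every compound.
    """
    plates = {}
    for p in range(n_plates):
        conc = top_conc_um / (dilution_factor ** p)
        plates[p] = {well: (cpd, conc) for well, cpd in compounds.items()}
    return plates

layout = interplate_titration({"A01": "cmpd-1", "A02": "cmpd-2"},
                              top_conc_um=57.0, dilution_factor=5, n_plates=3)
# Plate 0 holds both compounds at 57 uM; plate 2 holds both at 2.28 uM.
```

Because a whole plate shares one concentration, a dispensing or equipment fault corrupts a single point on each curve rather than an entire compound's series, which is the robustness benefit Dr. Simeonov describes.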

To further conserve reagents, the center continuously develops companion assays of matching throughput. “The initial hits typically require confirmation by labor-intensive biophysical and biochemical counterscreens. We miniaturize secondary screens to run in a high-throughput format,” says Dr. Simeonov.

A screen for inhibitors of FEN1, a key DNA repair enzyme, combined a primary fluorogenic screen with a secondary chemiluminescent bead-based assay, both reliably deployed in 1,536-well format. The team continues to change the screening paradigm by adapting other technologies to HT format, including microscale thermophoresis and acoustic dispensing.

Distributed Compound Screening

“High-throughput screening is still a costly and risky endeavor,” says Brian M. Paegel, Ph.D., assistant professor, department of chemistry, the Scripps Research Institute. “Sophisticated robotic systems are expensive to run and maintain, and the compound library capacity does not scale.”

Dr. Paegel envisions a miniaturized, fully automated HTS instrument that can operate in any laboratory. “We are pursuing a new paradigm in compound screening,” he says. “Just like DNA sequencing evolved from being concentrated at a few large-scale sequencing centers to being available at the benchtop and by the bedside, HTS could also become distributed. Distributed sequencing resulted in an explosion of sequencing data of all sorts of organisms. Similarly, distributed HTS would radically transform the drug discovery field.”

To achieve this vision, Dr. Paegel’s team pursues a two-pronged approach. First, the method for creating million-compound libraries uses prefabricated building blocks. The resulting compounds mimic natural products. The compound synthesis occurs on beads, and each step of the synthesis is coded in a DNA tag attached to the same bead.

“We are aiming to create a straightforward and robust synthesis process that can be simple enough to execute at the benchtop,” continues Dr. Paegel. “Next, the beads are screened in biological assays using our ultra-miniaturized and integrated microfluidic platform that supplants the need for robotics, microplates, and all of the associated costs.”

The bead library is manually pipetted into this microfluidic circuit, which then partitions individual beads into picoliter droplets that already contain all assay reagents. If a compound on a bead produces a positive hit in an assay, the bead is separated from the rest, and its DNA tag is sequenced to determine the compound’s synthetic history.

“We see HTS campaigns of tomorrow to be as distributed and inexpensive as DNA sequencing is today,” concludes Dr. Paegel. The team is currently adapting this technology to screen for new antibiotics and antiretroviral agents.

Universal Language

Industrial laboratory operations depend on robotics to perform routine, repetitive tasks at large scale. But robotics may also be extremely useful in a more dynamic research environment—if only the robots could adjust to a new procedure on the fly.

“It may take longer to program a new robotic procedure via a manufacturer-provided graphical user interface than to go into a lab and pipette yourself manually,” comments Nathan J. Hillson, Ph.D., fuel synthesis division, Joint BioEnergy Institute.

“Moreover, protocols from a given robotics vendor cannot be simply transferred to a robot from another vendor. Because of that, robots are largely underutilized in basic and translational research laboratories.” If all experimental protocols were written in the same language, and this language could easily be translated into instructions for any laboratory robot, it would provide a tremendous boost for academic science.

Gregory Linshiz, Ph.D., a postdoctoral fellow in the Hillson Synthetic Biology Lab, compares this idea to the way programming languages are used for consumer electronics. The general-purpose programming language C maps efficiently to the machine instructions of many different devices, a capability that propelled it to become one of the most widely used programming languages of all time.

“Using this analogy, one can imagine a programming language that maps to all lab automation devices,” continues Dr. Linshiz. “We developed PR-PR (Programming Robots; http://prpr.jbei.org) as an easy, user-friendly, high-level programming language that could interface with any existing robotic platform. Any biologist could learn to program basic procedures in PR-PR in under an hour.”

Furthermore, existing software packages can be extended to output PR-PR, enabling a direct interface between the software and robotics. The team tested the DNA assembly software j5 for output of PCR-setup protocols as PR-PR scripts. It takes less than a minute to program a Tecan robot using a j5-generated PR-PR script. The team continues converting other protocols into PR-PR to enable greater sharing between research teams.
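
The “write once, run on any robot” idea can be illustrated with a toy translator. This sketch does not use real PR-PR syntax or any vendor API; the command strings and back-end names are invented. The point is the architecture: one high-level protocol list, with interchangeable back ends that emit device-specific instructions.

```python
# One abstract protocol: (step, source, destination, volume_uL).
PROTOCOL = [("transfer", "mastermix", "A1", 8.0),
            ("transfer", "template", "A2", 2.0)]

def to_tecan(steps):
    # Hypothetical Tecan-style command strings (illustrative only).
    return [f"ASPIRATE {src};DISPENSE {dst};VOL {vol}"
            for _, src, dst, vol in steps]

def to_generic(steps):
    # A second back end; the protocol itself never changes.
    return [f"move {vol} uL from {src} to {dst}"
            for _, src, dst, vol in steps]

tecan_script = to_tecan(PROTOCOL)
```

Adding support for a new liquid handler means writing one new back end, while every protocol already expressed in the high-level form carries over unchanged — the portability argument Drs. Hillson and Linshiz make for PR-PR.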

Toxicology Profiling

An automated microscope-based screening and computer-assisted image analysis approach enables measurements of multiple features at the cellular level at the same time, giving rise to the term high-content screening (HCS).

“Another distinct advantage of HCS,” comments O. Joseph Trask, head of cellular imaging core, The Hamner Institutes for Health Sciences, “is the ability to quantitatively measure phenotypic changes at the single-cell level following compound challenge. However, assay development still remains a challenge.”

In collaboration with Tim Wiltshire, Ph.D., at the University of North Carolina, the Hamner team led by Russell Thomas, Ph.D., used the HCS platform to identify potential cellular toxicity pathways. Primary embryo fibroblasts isolated from 32 genetically characterized inbred mouse strains were exposed in concentration response to a diverse set of environmental and pharmaceutical agents. Comparison of the cross-strain toxicity response pointed to candidate genes.

“To enable a true comparison between cells of different genetic origins, we had to figure out how to plate all 32 cell lines on the same 384-well plate and to make them grow to the same steady state,” says Dr. Wiltshire. This was achieved by a creative adaptation of the Thermo Fisher Multidrop instrument. The instrument’s eight input tube lines, typically delivering cells and media from a single vessel, were modified to uptake cells from eight different vessels, each containing a unique cell line. Four of these modules in tandem produced a 384-well plate with 12 wells per cell line.
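
The plating arithmetic above checks out: four 8-channel modules deliver 32 distinct cell lines into one 384-well plate, giving 384 / 32 = 12 wells per line. A minimal sketch (well-naming and assignment scheme are illustrative, not the Hamner team's actual layout):

```python
ROWS = "ABCDEFGHIJKLMNOP"   # 16 rows x 24 columns = 384 wells
COLS = range(1, 25)

def assign_lines(n_lines=32):
    """Round-robin 32 cell lines across a 384-well plate (illustrative)."""
    wells = [f"{r}{c:02d}" for r in ROWS for c in COLS]
    layout = {}
    for i, well in enumerate(wells):
        layout.setdefault(i % n_lines, []).append(well)
    return layout

layout = assign_lines()
# Every one of the 32 cell lines ends up in exactly 12 wells.
```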

The test compounds had to be added in small volumes to improve solubility in growth media and reduce toxicity of organic solvents. The team modified an existing Biomek pin tool liquid handler to deliver compounds in the 200 nanoliter range directly from the compound library. “This HCS screen produced a particularly complex set of data including dose-response measurements of four multiplexed cellular responses at two exposure time points in 32 different cell lines, requiring the development of new informatics tools,” says Dr. Thomas. The team has identified several interesting pathways and is working on a detailed biological evaluation of the key genetic targets.

Primary mouse embryonic fibroblast strain DBA/2J cells imaged using Olympus 40X/0.9NA air objective lens on the BD Pathway 435 High Content Imager. Cells were labeled with Hoechst 33342 (blue), YoYo-1 (green), MitoTracker Orange (red), and Cytochrome C – AlexaFluor 647 (cyan). [Hamner Institute]

Protein-Protein Interaction Targets

“High-content screening is, indeed, a powerful tool for studying the effects of compounds in living cells,” agrees Paul Johnston, Ph.D., research associate professor, School of Pharmacy, University of Pittsburgh.

“Introduction of high-throughput, high-resolution multichannel imaging devices integrated with powerful image analysis algorithms/modules made many complex analyses, such as protein-protein interactions, possible.” Interaction between proteins is essential for all cellular functions, and thus represents a novel class of therapeutic targets. Historically, protein-protein interactions (PPIs) have been difficult to tap for drug discovery.

“We now know that protein-binding interfaces are composed of discrete ‘hot spots’ of contact rather than one contiguous surface,” continues Dr. Johnston. “We can combine the power of structure-based drug design with high-content screening to improve the success rate of finding small molecule PPI-disruptors.”

Dr. Johnston’s team implemented a semi-automated process to screen for disruptors of interaction between androgen receptor (AR) and TIF2 transcription initiation factor, which is thought to be important in prostate cancer. The process utilizes a positional biosensor method, originally commercialized by Cellumen, now a division of Apredica, a Cyprotex company.

Once set up, the pipeline can generate up to fifty 384-well plates per day using standard liquid-handling and culture-dispensing robots. “As Dr. Hillson correctly noted, the intricacy of frequently reprogramming laboratory robots is an impediment to complete automation of our workflow,” says Dr. Johnston. “However, the process is readily scalable.”

Recombinant TIF2 carries a localization signal directing it into the nucleolus, a compartment of the nucleus. AR can be stimulated to translocate into the nuclear compartment by treating cells with dihydrotestosterone. Co-localization of the proteins in the nucleolus is readily detected by imaging of fluorescent reporters tagged to each of the protein interaction partners. Initial screening of 1,700 compounds from the LOPAC set and NIH Clinical Collection library revealed several hits that either prevented the formation of AR-TIF2 complexes or disrupted already formed interactions. The team is gearing up to screen a much larger (>200,000) compound library.

Object-Oriented Workflows

“Chemspeed introduced numerous disruptive technologies in laboratory automation workflows,” says Michael Schneider, senior vp, business development. “Our Object Oriented Workflow Design is the foundation of all our product lines. By combining independent, functional Shuttles (sample carriers) and Objects (processing workstations), we created a new workflow paradigm that allows samples to be processed in parallel or sequentially. This flexible design supports practically unlimited workflow variations.”

Each object is an independent “machine”, capable of receiving inputs, processing data, and sending outputs to other objects. Chemspeed’s modular objects are connected by a production-proven track system, carrying up to 100 shuttles. Each shuttle carries its own processes and interacts differently with “objects” along the track.

“This makes the workflow more robust,” continues Schneider. “If one shuttle fails for whatever reason, the others continue on.” The key to designing object-oriented workflows is to carefully identify all the required process steps, to connect them in a logical fashion, and to establish input and output parameters.
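
The shuttle/object pattern described above can be reduced to a toy model. The class names here are illustrative, not Chemspeed's software: each shuttle carries its own process recipe and interacts only with the stations that recipe names, so shuttles are independent of one another.

```python
class Station:
    """A processing workstation ("object") along the track."""
    def __init__(self, name):
        self.name = name
    def process(self, shuttle):
        shuttle.log.append(self.name)

class Shuttle:
    """A sample carrier with its own recipe of station names."""
    def __init__(self, sid, recipe):
        self.sid, self.recipe, self.log = sid, recipe, []

def run_track(stations, shuttles):
    # Each shuttle visits only the stations in its recipe; a failed
    # shuttle would simply drop out without blocking the others.
    for sh in shuttles:
        for st in stations:
            if st.name in sh.recipe:
                st.process(sh)
    return shuttles

track = [Station("uncap"), Station("dispense"), Station("recap")]
done = run_track(track, [Shuttle(1, {"uncap", "dispense", "recap"}),
                         Shuttle(2, {"dispense"})])
```

Because the workflow state lives with each shuttle rather than in one global program, removing a failed shuttle leaves the rest of the run untouched — the robustness property Schneider highlights.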

Another key enabling element of Chemspeed’s platforms is the flexible exchangeability of more than 40 robotic tool features. Akin to a Swiss Army knife, the unique robotic arm can accommodate tools for substance handling, action, or analysis, including gravimetric dispensing, homogenization, ultrasonic dispersion, capping, and many others. These tools can be seamlessly exchanged during a run, adding further flexibility to workflow design.

Pharmaceutical compound management fits perfectly into such a workflow. The processes may include compound retrieval, removal of screw caps, solvent addition, microdispensing, barcoding, quality assurance analysis, and return to storage. Chemspeed perfected gravimetric dispensing of solids, liquids, and viscous liquids.

The upcoming SWILE dispensing technology is specifically positioned for compound management. Chemspeed’s SWILE supports a “many-to-many” dispense mode and can gravimetrically dispense sub-milligram quantities of compounds across a wide range of consistencies, from viscous oils through waxy materials to solids. Thanks to an automatically exchanged, disposable, all-glass dispensing “tip,” cross-contamination is eliminated.
