March 15, 2005 (Vol. 25, No. 6)
Enabling Good Science by Optimizing Planning and Execution of Projects
One of the highlights of “Lab Automation 2005” was the discussion of topics such as planning and executing successful automation projects, the strategies and technical features that make up a successful automated system, the upstream and downstream impacts of lab automation, strategies and tools for managing data from automated systems, and current and future lab automation technologies.
“Technology is whatever you can harness to improve the process,” said Steven D. Hamilton, Ph.D., of Sanitas Consulting (Boulder, CO). “Technology should drive down the cost, shorten the process time, improve productivity and quality, and encourage innovation. While early adoption may give your organization a competitive advantage, late adoption will allow your organization to learn from others’ mistakes.”
Pharmaceutical R&D is the largest adopter of lab automation, according to Dr. Hamilton. The lab automation market is estimated to be nearly $2 billion annually. During the past decade, drug companies have spent more than $40 billion on technology-related collaborations and acquisitions.
However, the automation effort has resulted in few safe and efficacious drugs. While automation can enable good science, it cannot create it, according to Dr. Hamilton.
“The impact of technology on the balance and flow of the entire process must be understood, planned, and managed. The winning companies effectively integrate technology with science using both in-house resources and external partnerships.”
Asking the Right Questions
Dr. Hamilton emphasized the importance of enterprise optimization. While more data can be generated faster, that capability cannot replace the process of asking and answering the proper scientific questions. The data explosion must be accompanied by increasingly optimized processes for knowledge integration, information flow, and decision-making.
Dr. Hamilton recommended understanding lab automation, choosing the right projects and technology, developing a strategy and plan, finding the resources, managing the project, and implementing, validating, and preparing for long-term operation.
He suggested evaluating bottlenecks by determining the “takt” time, the rate at which parts must be produced in order to satisfy demand. Takt time serves as the heartbeat of an industrial process.
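As a rough illustration of that arithmetic (the shift length, plate demand, and cycle times below are hypothetical, not figures from the talk), a takt-time check for a screening lab might look like this sketch:

```python
# Minimal sketch of a takt-time calculation for a hypothetical screening lab.
# All numbers are illustrative assumptions.

available_seconds = 8 * 60 * 60        # one 8-hour shift of instrument time
plates_demanded = 240                  # plates the downstream assay needs per shift

takt_time = available_seconds / plates_demanded   # seconds available per plate
print(f"Takt time: {takt_time:.0f} s per plate")

# Compare each step's cycle time with the takt time to locate bottlenecks.
cycle_times = {"liquid handling": 95, "incubation": 110, "plate reading": 130}
for step, seconds in cycle_times.items():
    status = "bottleneck" if seconds > takt_time else "ok"
    print(f"{step:>15}: {seconds:>4} s  ({status})")
```

Any step whose cycle time exceeds the takt time cannot keep pace with demand and becomes the bottleneck to address.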
The laboratory unit operation (LUO) is the basis of laboratory architecture. It can include sample transport, sample processing, and data handling, according to Dr. Hamilton. A task such as a simple enzyme reaction is a collection of LUOs.
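One way to picture that decomposition (the specific steps below are an assumed, illustrative breakdown rather than an example given in the talk) is to model a task as an ordered collection of LUOs:

```python
# Illustrative sketch: a task modeled as an ordered list of laboratory
# unit operations (LUOs), following the categories described above.
from dataclasses import dataclass

@dataclass
class LUO:
    name: str
    category: str   # "sample transport", "sample processing", or "data handling"

# A simple enzyme reaction expressed as a collection of LUOs (hypothetical breakdown).
enzyme_reaction = [
    LUO("move plate to liquid handler", "sample transport"),
    LUO("dispense substrate and enzyme", "sample processing"),
    LUO("incubate 30 min at 37 C", "sample processing"),
    LUO("read absorbance", "sample processing"),
    LUO("store results in database", "data handling"),
]

for step in enzyme_reaction:
    print(f"{step.category:>18}: {step.name}")
```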
Integrated System Model
In the early 1980s, robot-centric models of laboratory automation transported samples and performed many processing LUOs, but they were not very efficient. The workstation model, which performed an automated function involving a limited group of LUOs, was efficient but limited.
“Now the robot-centric model is a thing of the past, and the integrated system model (a collection of devices that performs an automated process involving a large number of LUOs, often using multiple workstations serviced by a general-purpose transport system) is what we see in the laboratory,” Dr. Hamilton said.
Gary W. Kramer, Ph.D., of the National Institute of Standards and Technology (Gaithersburg, MD), explained that reliability is the most important asset of a system. “Increased complexity can mean decreased reliability,” he added.
“You can avoid errors by building error sensing into system devices, using only ‘automation-grade’ disposables, minimizing the use of unproven custom devices, adding sensors to critical steps, confirming critical operations, assuring by weight instead of volume, avoiding anthropomorphic solutions, not skimping on verification steps, and continually testing.”
Dr. Kramer recommended building performance-metric monitoring into every system. This involves performing standard tests on automated systems at regular intervals; capturing, storing, and analyzing results automatically; tracking critical information with charts; sending immediate automatic e-mail notifications on out-of-spec performance or other errors; and generating and automatically distributing periodic summary reports.
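The core of such monitoring can be quite small. The sketch below shows the general idea under assumed names: the metrics, control limits, and the notify() stub are placeholders for illustration, not part of any specific product or of Dr. Kramer’s systems.

```python
# Minimal sketch of a performance-metric check: run a standard test,
# compare results against control limits, and flag out-of-spec values
# for notification. All names and limits are hypothetical.

def notify(message: str) -> None:
    # Placeholder for an automatic e-mail or messaging hook.
    print("ALERT:", message)

CONTROL_LIMITS = {"dispense_cv_percent": (0.0, 5.0), "read_time_s": (0.0, 90.0)}

def check_standard_test(results: dict) -> None:
    for metric, value in results.items():
        low, high = CONTROL_LIMITS[metric]
        if not (low <= value <= high):
            notify(f"{metric} = {value} outside control limits ({low}-{high})")

# Example run with one out-of-spec metric.
check_standard_test({"dispense_cv_percent": 6.2, "read_time_s": 75.0})
```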
Very little has been standardized in terms of lab automation integration, according to Dr. Kramer. The Society for Biomolecular Screening (SBS) succeeded in developing standards involving 96-well plates, but that standard took eight years to complete.
Other standards have been developed for specific instruments, such as high-performance liquid chromatographs, mass spectrometers, nuclear magnetic resonance spectrometers, infrared spectrophotometers, and ultraviolet spectrophotometers.
In other words, most standards are successful only in very narrow areas. Economic drivers for broad standards are not strong in a small-volume, highly fragmented market.
Mark F. Russo, Ph.D., of Bristol-Myers Squibb Discovery Technologies (Princeton, NJ), addressed the topics of programming automation and data management. He described commercial data management systems and the advantages of buying versus building such a system. He compared application-specific data systems to multipurpose data systems.
LIMS
Dr. Russo described LIMS in great detail. Such a system, he said, is designed to organize data produced by a laboratory, allow for easy search and recall, assist with scheduling lab activities, generate useful reports, and track measurements (results, limits, errors, and dates), activities (methods, revisions, and schedules), personnel (roles, restrictions, and qualifications), and products (name, type, cost, hazards, location, and age).
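The record types he listed map naturally onto a simple data model. The sketch below is only an illustration of that mapping; the field names are assumptions, not a real LIMS schema.

```python
# Illustrative sketch of two of the record types a LIMS tracks, per the
# description above. Activities and personnel would follow the same pattern.
from dataclasses import dataclass
from datetime import date

@dataclass
class Measurement:
    result: float
    limits: tuple          # (lower, upper) specification limits
    error: float
    recorded: date

@dataclass
class Product:
    name: str
    product_type: str
    cost: float
    hazards: list
    location: str
    age_days: int
```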
While LIMS can benefit laboratories greatly, these systems fail half or more of the time, Dr. Russo said. Reasons include not thoroughly understanding the process, starting with a problem that is too big, not having the ability to customize easily, poor identification and sharing of goals, inadequate financial and staff resources, clients not being involved from the beginning, resistance to change, lack of management commitment, and no plan to manage the changeover.
Dr. Russo also detailed the benefits of archiving data in standard formats, which make unrestricted data manipulation possible, provide the ability to retrieve the raw data long after the instrument is gone, and allow the flexibility to display and analyze data using the most appropriate software tool.
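A minimal sketch of that archiving idea follows: instrument output written to an open format (CSV here) stays readable long after the instrument and its vendor software are gone. The file name and fields are illustrative assumptions.

```python
# Archive raw readings in a plain, standard format so any future tool can read them.
import csv

readings = [
    {"well": "A1", "absorbance": 0.412},
    {"well": "A2", "absorbance": 0.398},
]

with open("plate_readings.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["well", "absorbance"])
    writer.writeheader()
    writer.writerows(readings)

# Any spreadsheet, statistics package, or script can reopen this file,
# independent of the original instrument's proprietary software.
```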
Dr. Hamilton talked about lab automation trends over the past several years. “Automation has matured, providing more reliable and sophisticated, but also more expensive systems. The number of technology providers has mushroomed, leading to increased consolidation.
“Early fascination with ever-increasing throughput has shifted to an emphasis on improving the quality and relevance of data, leading to trends such as high-content screening and more targeted, druglike compound synthesis.”
Many organizations, according to Dr. Hamilton, find that their R&D structure is not suited to the “industrialized research factory” model. Bottlenecks can appear upstream or downstream, and restructuring has become popular.
Miniaturization has made strides, but not as fast as had been “overexpected,” he said. He detailed hot technologies, showing that some mature ones will still have double-digit growth rates in the foreseeable future, and he discussed some of the newer technologies, such as nano/picoliter liquid handling, noncontact transfer, and high-content screening.
Pharmacogenomics and molecular diagnostics are generally considered to be the most significant growth areas for tools and technology.
“Pharmacogenomics and molecular diagnostics have many automated procedures in common,” Dr. Hamilton explained. “We’re seeing a shift from large automated systems derived from the Human Genome Project to workstation-scale automation.
“Recent advances include better purification directly from whole blood, real-time PCR (rtPCR) for simultaneous amplification and detection, and eventual linking of purification and rtPCR.”