December 1, 2006 (Vol. 26, No. 21)

Elizabeth Lipp

The 21st Century Microscope Sees Life More Dynamically

Some time before 1668, Antony van Leeuwenhoek learned to grind lenses and make simple microscopes. Scientists have been looking through different versions of this imaging system ever since.

There’s more to the 21st century microscope, however. The trajectory of life sciences over the last two decades has shaped the field of microscopy. “People are much less interested in static black and white images,” says Karl Garsha, head applications scientist, Photometrics (www.photomet.com).

“With the growth of live cell and tissue analysis, there is much more dynamic information for the scientist to gather in terms of wavelength, space, and time. So, as we’ve migrated back to imaging in cell biology, there had to be technologies capable of capturing quantitative data down to the molecular level. Such approaches demand a higher degree of accuracy and precision than simple qualitative image capture. The requirements for collecting an image from the microscope are quite different now from 20 years ago.”

Robert LaBelle, director of marketing, microscopy, Leica Microsystems (www.leica-microsystems.com) notes that, “over the last two decades, the microscope has moved from a static observation tool to a cellular workbench. This means that the microscope is incorporated into a larger experimental design that combines dynamic spectral imaging of molecular probes, lasers, software, digital imaging, and control over the cells’ environment.”

“Now that the sequences coding for all human proteins are known, modern approaches can be utilized to express them genetically in multiple colors and track them, studying their dynamics in multiple dimensions,” says Stephen Ross, senior scientist and manager of product and technology, Nikon Instruments (www.nikonusa.com). “This requires microscopes that are multidimensional workstations imaging X, Y, Z, time, and spectral dimensions to elucidate these dynamic interactions.”

Technology has made microscopy much more interactive than looking at a static image, notes Nicolas George, manager of microscope systems at Olympus America (www.olympusamerica.com). “Not only do we have the capability of focusing deeper into a specimen with infrared and multiphoton technologies, but we are able to remove much of the out-of-focus light coming from other areas of the specimen so we can more clearly view life processes as they occur far below the surface.”

Developments in fluorescent probes are a key advance. “These probes give us the ability to label different parts of a cell and to view changes as they occur in living tissue,” adds George. “Having specific fluorescence labels has been a huge step forward.”

The capabilities of image analysis tools have also changed greatly in the last 20 years, notes Angela Higgs, product manager at Zeiss (www.zeiss.com). “The microscope is no longer just a microscope—it is part of a system to efficiently view, document, analyze, and communicate.”

Emerging Trends

“Researchers drive our innovations—they want better resolution, brighter fluorescence, and better signal-to-noise ratios in a microscope that is easy to use,” Higgs says. “When it comes to lab equipment, we recognize that microscopes are a significant investment. The challenge is to incorporate changes in a modular fashion so existing microscopes can grow with the demands of research.”

The trends in microscopy fall into five areas, according to Garsha. “The big things are automation, which includes hardware control for data acquisition and brute force data processing; parallelization and probe multiplexing, which makes it possible to collect data on multiple scientific questions or variables in parallel; increased dimensionality, including 3-dimensional spatial acquisition, time-lapse, and multiple wavelengths; quantitation of image data; and development of specialized chemical and genetic probes to probe intracellular biochemistry.”
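
To make the increased-dimensionality and quantitation trends concrete, here is a minimal sketch (with entirely hypothetical dimensions and intensities) of how a five-dimensional image stack covering time, z, wavelength channel, y, and x might be organized and reduced to a simple quantitative readout.

```python
# Illustrative sketch only: a hypothetical 5-D image stack and a basic quantitation step.
import numpy as np

# Assumed dimensions: 10 time points, 20 z-slices, 3 wavelength channels, 512 x 512 pixels
t, z, c, y, x = 10, 20, 3, 512, 512
stack = np.random.poisson(lam=50, size=(t, z, c, y, x)).astype(np.uint16)

# Quantitation example: mean intensity per channel at each time point (a time-lapse readout)
mean_over_time = stack.mean(axis=(1, 3, 4))  # collapses z, y, x; shape becomes (time, channel)
print(mean_over_time.shape)                  # (10, 3)
```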

The automation and enhanced features of the microscope have been essential to its resurgence as a scientific tool. Rapid advances in computing power facilitate automated data collection, storage, and processing. “The needs of the researcher changed when nonimaging wet lab techniques reached the end of the road,” says Garsha. “Powerful molecular techniques have their uses, but image data has regained its importance as an experimental tool. The practice of looking at intact cells and tissues is back. This time, people want to see live cells and tissues.”

“The trend has been toward beautifully integrated systems,” says Ross. “As these systems become considerably more complex, the great challenge is to tie all the components together with powerful software, making complex systems functionally simple for the end user.”

The biggest revolution in microscopy came with the advent of live cell culture and examination. “Biological research has dramatically advanced the investigation into the structure and function of the living cell,” notes Will Rogers, vp of applications and technology development, Leica Microsystems. “High-speed spectral confocal microscopy has enabled the high-resolution imaging of subcellular events, and emerging super-resolution technologies will allow the nanoscale investigation of endogenous nuclear proteins.”

Garsha concurs, saying, “the overall big picture is that we are moving to live cell imagery tied directly to molecular cell biology. Now we have a plethora of probes that can be cloned. Genetic probes are a large and important modality that people are using now since we’ve gathered so much genome sequence data that needs to be deciphered. We can now visualize protein interactions and we can take apart cell operations in a dynamic environment.”

Fluorescence microscopy thus came to the fore. In 1992, GFP was cloned from the jellyfish Aequorea, and the ability to link this gene to proteins of interest for transfection into living cells provided a revolutionary tool for live cell microscopy. The technology for live cell imaging continued to develop through the 1990s. Numerous genetically encoded probes were developed, and the hardware technology to track them evolved to keep pace.

“People can also use genetically encoded probes to monitor such parameters as ATP production and ion-flux,” notes Garsha. “Fluorescence microscopy is also used to map gene locations on intact chromosomes. Hardware and software development has become a matter of providing researchers with the specialized tools they need to see the broad functionality of cells in a dynamic context.”

In terms of camera quality itself, Ross notes that, “cameras have become much more sensitive, and optics have become much better. Modern cameras have quantum efficiencies over 90%, and filters can now provide about 95% transmission of desired wavelengths and completely block others. Additionally, live cell confocal systems, such as the Live Scan SFC, allow scientists to image living specimens for a long time with minimal bleaching and subsequent phototoxicity.”
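
As a back-of-the-envelope illustration of how those figures combine, the sketch below multiplies an assumed per-pixel photon count by the quoted filter transmission and quantum efficiency; the photon count is a placeholder chosen purely for illustration.

```python
# Rough sketch: detected signal = emitted photons x filter transmission x quantum efficiency.
emitted_photons = 1000       # photons reaching the emission filter per pixel (assumed value)
filter_transmission = 0.95   # ~95% transmission of desired wavelengths (figure quoted above)
quantum_efficiency = 0.90    # >90% QE for modern cameras (figure quoted above)

detected = emitted_photons * filter_transmission * quantum_efficiency
print(f"Detected photoelectrons per pixel: {detected:.0f}")  # roughly 855 of the original 1000
```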

“Labs want to see better time resolution in the context of rich multidimensional data sets,” says Garsha. “And companies have responded in every niche imaginable. There are chemical and genetic probes, environmental chambers to keep cells alive, sophisticated automation, and powerful software packages. There is a constant effort to understand what researchers are doing and what they need to answer scientific questions.”

Digital imaging has changed everything. “Image quality and accuracy are really big issues,” says Higgs. “Researchers want to be able to image the cell clearly and true to nature. We see a big shift away from basic microscopy. It is less about looking through the eyepieces and more about the image and data on the computer screen.”

But George notes that the eyepiece is still in use in the 21st century microscope. “A lot of microscopes can be configured without them, but we find that most labs still request eyepieces as part of the package along with the monitor. Some researchers find that it speeds the process of lining up the specimen and making sure it’s in the right place. In some experiments, speed is of the essence to minimize phototoxicity to the cells.”

As with any other purchase, what a lab eventually decides mainly depends on whether it is a core lab that requires high application flexibility or a smaller lab looking for basic functionality. Another consideration is reliability and downtime; experiments can be hampered by a microscope that constantly breaks down or software that crashes.

“Support and service might mean a tech comes out with the part and fixes it,” says Garsha. “Or it might mean that you have to ship it back to the company and wait for the repair. What is the cost of ownership? How much is a service contract going to cost? How much do expendables associated with instrument operation cost? Some sort of preventative maintenance schedule may be important in the context of a particular instrument.”

Ease of training and use is another consideration. “Probably the single biggest accomplishment any microscope manufacturer can achieve is making this really complex technology accessible and usable for everyone in the lab,” says Ross. “New technology is available that basically puts live cell imaging in a PC-sized box, controlling CO2, temperature, humidity, and imaging in phase contrast and multiple fluorescence channels. This was designed so that traditional geneticists and biochemists can now do live cell imaging instantly, without the two-year or more learning curve it can take to master such complex applications.”

Close to Perfect

The progress in microscopy over the last few years has made further improvements challenging. “Optics have improved to such a point that we are close to the theoretical limit of perfection,” Ross says. “Right now our optics have numerical apertures of up to 1.49 with a theoretical physical limit of 1.51 with standard oil immersion. Optical quality is at the heart of every microscope system. Not only have numerical apertures increased, but the optics can accommodate a much wider range of wavelengths, especially allowing further advances in near-infrared imaging.”
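
To see what that jump in numerical aperture buys, the Abbe lateral resolution limit, d = wavelength / (2 x NA), can be worked through; the 520-nm emission wavelength below is an assumed example, roughly matching GFP emission.

```python
# Sketch of the Abbe lateral resolution limit, d = wavelength / (2 * NA).
# The 520 nm emission wavelength is an assumed example; NA values follow the text above.
wavelength_nm = 520.0

# An assumed typical high-NA objective (1.40), the 1.49 quoted above, and the 1.51 theoretical limit
for na in (1.40, 1.49, 1.51):
    d_nm = wavelength_nm / (2.0 * na)    # smallest resolvable lateral separation
    print(f"NA {na:.2f}: ~{d_nm:.0f} nm lateral resolution")
```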

Experts agree that there are always physical tradeoffs in trying to perfect the technology. “Scientists are looking for a tool that they need to do a particular job,” Ross says. “They might read a paper in a journal and see an application that they need to address a specific question. The easiest way to accomplish this is to copy the imaging system referenced in the paper. It’s all about having the correct tool for the question you are trying to answer.”

Computational power has helped the scientist, and improving the signal-to-noise ratio is another big design consideration. Maximizing that ratio allows scientists to get a better read on whatever they are looking for.
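
One rough way to frame that design consideration is a simple camera noise model in which SNR = S / sqrt(S + B + r^2), with S the signal electrons, B the background electrons, and r the read noise; all of the counts in the sketch below are illustrative assumptions.

```python
# Minimal sketch of a shot-noise-plus-read-noise SNR estimate; every value is an assumption.
import math

def snr(signal_e: float, background_e: float, read_noise_e: float) -> float:
    """SNR = signal / sqrt(signal + background + read_noise^2), all in electrons."""
    return signal_e / math.sqrt(signal_e + background_e + read_noise_e ** 2)

print(f"Dim live-cell exposure: SNR ~ {snr(100, 20, 6):.1f}")    # about 8
print(f"Brighter exposure:      SNR ~ {snr(1000, 200, 6):.1f}")  # about 28
```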

But Garsha notes that researchers still want more. “Many researchers want to have higher temporal resolution at lower light flux, requiring more sophisticated detection systems. Phototoxicity and photobleaching are key problems in the context of live cell imaging. In order to preserve the cell, you need to keep lighting low and use exquisitely sensitive detectors to pick up the signals you are looking for.”

There is and always will be room for improvement, says Higgs. “Whether it is the quality of resolution, the chromatic correction of the light path, the ability to process large volumes of information more quickly, or the incorporation of a new technology, researchers are always pushing us to provide more.”
