Targeting for Tumor Management
Oncologic applications loom large on the proteomic profiling landscape. Another group will focus specifically on Kv10.1 (Eag1) as a target for tumor management. Luis Pardo, M.D., Ph.D., group leader at the Max-Planck Institute for Experimental Medicine, says he will describe several tumor theranostic approaches that take advantage of the selective expression profile of Eag1 channels in tumors, together with their role in tumor biology.
Dr. Pardo’s group has been working on a particular ion channel in tumor cells that is scarcely expressed outside the brain. “This is a selective marker for tumor cells—normal cells expressing it are protected by the blood-brain barrier,” he explains. “It allows an aggressive therapy with low side effects. We have developed blocking antibodies against the channel in animal models. Side effects are low, but the treatment is not as effective as we would like, so we are developing combinations of the antibody with cytotoxic compounds.”
Dr. Pardo adds that his group’s approach is standard, “but what is unique about what we are doing is the target. Until now, tumor markers have generally been specific for certain tumor types; this one is found across a wide range of tumor types, so one is not restricted to a particular indication. It also has the advantage of dealing with intact cells. You don’t have to get into the cell—it’s on the surface.”
Screening against individual therapeutic proteins can identify hit compounds. Wayne Bowen, Ph.D., CSO at TTP LabTech, notes that a systems biology approach is better suited to target identification and mechanism-of-action studies, and that high-content methodologies are addressing this need in a higher-throughput environment.
“We are part of a large group, The Technology Partnership. It’s a pretty special setup; we sell both instrumentation and development consultancy, although we are a bit different in our approach,” points out Dr. Bowen. “We provide novel tools to our clients, who dictate which instrumentation comes to the market.”
In his presentation, Dr. Bowen will describe the Acumen eX3’s laser-scanning approach, which enables scanning of entire wells, cell by cell, an approach he says is ideal for analyzing large objects and improving assay robustness. Customer-driven software applies cytometric principles rather than image analysis, making screening less labor-intensive across a broad range of high-content assays.
“We work in large area screening—20 mm x 20 mm is a large enough area to simultaneously scan four wells in a 96-well plate,” he says. “What Acumen offers is a wide field-of-view, rapidly reporting high-content data for every cell in every well.”
Dr. Bowen notes that despite its broad applicability, this technology still sees restricted use. “The Acumen is used for sustained 24/7 high-content screening—essentially highly automated analysis of many plates with little user intervention. The instrument offers high-throughput analysis and gives you data for all cells. People make mistakes, and so do robots. If you can count all the cells in each well, you can normalize the data. Whole-well scanning thus adds to the robustness of the data.”
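The normalization idea described above can be sketched in a few lines: dividing each well’s raw readout by the number of cells counted in that well makes wells with uneven seeding or pipetting errors comparable. This is a minimal illustration only; the function name, well IDs, and readout values are invented examples, not actual Acumen output.

```python
def normalize_by_cell_count(wells):
    """Convert raw per-well readouts into per-cell values.

    `wells` maps a well ID to a (raw_signal, cell_count) pair.
    Wells with zero cells are flagged as None rather than dividing by zero.
    """
    normalized = {}
    for well_id, (raw_signal, cell_count) in wells.items():
        if cell_count == 0:
            normalized[well_id] = None  # flag empty wells for review
        else:
            normalized[well_id] = raw_signal / cell_count
    return normalized


# Hypothetical plate data: well B2 has half the cells of A1,
# but its per-cell signal remains directly comparable.
plate = {
    "A1": (12000.0, 400),  # (summed fluorescence, cells counted)
    "B2": (6100.0, 200),
    "C3": (0.0, 0),        # empty well
}
print(normalize_by_cell_count(plate))
```

The per-cell values for A1 (30.0) and B2 (30.5) now sit side by side despite the twofold difference in cell number, which is the robustness benefit whole-well counting provides.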
Dr. Bowen describes one instance where speed was key. “In seven days, we were able to scan 18,000 genes twice in an RNAi profiling screen and found 1,000 interesting genes. The secondary confirmation screen was done in three hours: I received thirteen 384-well plates at 9:00 am and had them read by 12:15 pm. This kind of follow-up allows you to be successful, fast.”
According to Dr. Bowen, producing high volumes of data can be counterproductive. “We process terabytes of information but throw it away as soon as the high-content parameters have been determined, so there is no requirement for a long-term data-storage solution. You develop and validate an assay and run with it. A key point is that researchers now want to screen at higher throughput. Why is high-content analysis not used more? Not every technology is directly transferable; some don’t last or can’t be used in every instance.”