Specifics of Label-Free Technologies
Label-free sensors measure changes in cellular phenotype using detection technologies such as impedance, surface plasmon resonance, planar waveguides, or acoustic resonance. Changes in the readout correspond to known and expected phenotypic changes such as growth or shrinkage of cells, redistribution of intracellular mass, changes in cell shape, or changes in how cells orient to one another.
A label-free experiment usually starts by measuring the signal of an experimental base state. When stimulating agents (e.g., compounds) are applied to the cells, the resulting phenotypic changes should cause a signal change. Both the phenotype of the base state (before compound addition) and the phenotypic change caused by the compound have to be verified outside the instrument (e.g., under a microscope). The signal change corresponds to the difference between these two verified phenotypes.
To better monitor and understand the relation between the signal change and the associated phenotypic change, additional measurements are recorded between the baseline and a defined end time. The number and frequency of measurements are determined during assay development based on the confirmed phenotypic changes. Often a specific signal is expected between t_zero and t_end, e.g., a peak indicating a specific temporary, reversible change of the cells. Figure 1 shows example traces from a Corning EPIC® instrument.
Simplified (and frequently applied) data-analysis methods use only the maximum and baseline measurements. In this example, however, the important information lies in the shape of the traces, and that shape must be analyzed to obtain useful results.
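The point about trace shape can be illustrated with a minimal sketch. The traces, feature names, and numbers below are hypothetical and not tied to any vendor's software; the sketch only shows how two traces with identical max-minus-baseline responses can differ in kinetics:

```python
def trace_features(trace, dt=1.0):
    """Reduce a time trace to a few simple shape descriptors (illustrative only)."""
    baseline = sum(trace[:4]) / 4                 # mean of the first four points
    peak = max(trace)
    t_peak = trace.index(peak) * dt               # time-to-peak
    auc = sum(v - baseline for v in trace) * dt   # area above baseline
    return {"response": peak - baseline, "t_peak": t_peak, "auc": auc}

# Two hypothetical traces with the same peak height but different kinetics:
transient = [0, 0, 0, 0, 2, 6, 10, 6, 2, 0, 0, 0]
sustained = [0, 0, 0, 0, 2, 6, 10, 9, 8, 8, 8, 8]

# A max-minus-baseline readout cannot tell them apart ...
print(trace_features(transient)["response"])   # 10.0
print(trace_features(sustained)["response"])   # 10.0
# ... but a shape feature such as area under the curve can:
print(trace_features(transient)["auc"])        # 26.0
print(trace_features(sustained)["auc"])        # 59.0
```

In practice, which shape features matter (time-to-peak, area, decay slope, etc.) depends on the assay and is established during assay development.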
During assay development, experimental conditions and measurement frequencies are defined, as is the data analysis. This includes aggregating the signal over defined time points (e.g., baseline: mean of the first four measurements; peak: maximum signal of time points 12 to 20). During the actual screening experiment, however, deviations may occur:
- The test substance shows an unexpected effect not captured by the controls used to calibrate the experiment.
- The well shows unexpected issues (e.g., a strong artifact) that dominate the signal of the compound in that well.
- Signals are shifted from plate to plate—both in strength and time.
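The third deviation is easy to reproduce in a minimal sketch of the fixed aggregation described above (baseline: mean of the first four measurements; peak: maximum of time points 12 to 20, taken as 0-based indices here). All trace values are hypothetical:

```python
def aggregate(trace):
    """Fixed aggregation as defined during assay development (illustrative only)."""
    baseline = sum(trace[:4]) / 4    # baseline: mean of the first four measurements
    peak = max(trace[12:21])         # peak: maximum of time points 12 to 20
    return peak - baseline

# Hypothetical trace with its peak inside the expected window ...
on_time = [0.0] * 14 + [1.0, 3.0, 5.0, 3.0, 1.0] + [0.0] * 9
# ... and the same response arriving six time points later on another plate.
shifted = [0.0] * 6 + on_time[:-6]

print(aggregate(on_time))  # 5.0 -> captures the full response
print(aggregate(shifted))  # 1.0 -> the peak has left the fixed window
```

A fixed window that was correct during assay development can thus silently underestimate responses on plates where the signal arrives later, which is why access to the full traces matters.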
These issues are compounded because most screening platforms do not allow analysis of the full time traces; they import only the aggregated results defined during assay development. Results are calculated automatically (often in batch mode) by the software accompanying the label-free instrument. This situation has severe consequences for data analysis.
- It becomes next to impossible to see the actual time traces in the context of plate QC or hit-list creation. As a result, important trends in the traces go undetected, reducing the lab's ability to generate reliable results.
- If aggregations must be adapted, data analysis starts from scratch. This requires changing the parameterization in the instrument software, followed by an export and re-import into the screening software. The result is inefficient screening, decreased productivity, and a potentially higher error rate.
- When multiple instruments from different vendors are used in the same laboratory, each brings its own data aggregation and treatment methods. As a result, data processing becomes inconsistent.