March 15, 2012 (Vol. 32, No. 6)

Arthur Weissman, Ph.D., VP and CSO, PerkinElmer
Benjamin Lineberry, Project Manager, PerkinElmer

The field of safety toxicology is entering a new era shaped by the burden of evaluating the ever-growing number and diversity of chemical entities to determine their impact on the environment and human health. The need to establish a new way of obtaining critical information on the possible toxic effects of particular chemical exposures is further driven by the costs, time, and extensive animal experimentation that traditional risk assessment requires.

In particular, the need for this information is urgent in light of new regulatory requirements that are proposed or already in place in the U.S. and abroad. One approach to these issues is to develop a series of in vitro measurements that can not only serve as a first-line estimate of a compound’s potential toxicity, but also provide insight into possible mechanisms of that toxicity.

Such information from a large number of compounds can be rapidly assembled through high-throughput screening of relevant in vitro assays and then combined with a database of compounds with known in vivo toxicity data to extract possible similarities.
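As a rough illustration of this combine-and-compare step, the sketch below correlates a small, entirely hypothetical compound-by-assay potency matrix with known in vivo outcome calls to flag assays that track an adverse effect; the assay names, compound identifiers, and endpoint are assumptions for the example, not an actual ToxCast extract.

```python
# A rough sketch (hypothetical data) of combining a high-throughput in vitro
# screen with known in vivo toxicity calls to look for assays that track an
# adverse outcome.
import pandas as pd
from scipy.stats import pointbiserialr

# Compound-by-assay potency matrix from the screen: pAC50 values, higher = more potent.
invitro = pd.DataFrame(
    {"assay_ER": [6.2, 4.1, 7.0, 4.5, 6.8],
     "assay_PXR": [5.0, 5.1, 4.9, 5.2, 5.0]},
    index=["cmpd_1", "cmpd_2", "cmpd_3", "cmpd_4", "cmpd_5"],
)

# Known in vivo outcome for the same compounds (1 = adverse finding observed).
invivo = pd.Series([1, 0, 1, 0, 1], index=invitro.index, name="adverse_finding")

# Correlate each assay readout with the in vivo outcome; the assays with the
# strongest association are candidate markers of the adverse effect.
associations = {
    assay: pointbiserialr(invivo, invitro[assay])[0]  # correlation coefficient
    for assay in invitro.columns
}
ranked = sorted(associations.items(), key=lambda kv: abs(kv[1]), reverse=True)
print(ranked)
```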

For the pharmaceutical industry, human safety concerns for new chemical entities are a continual issue since these compounds are intended for human use. Within this industry, a finite set of assays is used to test compounds for potential safety liabilities prior to their introduction to human populations.

The questions arise as to whether these existing in vitro datasets are comprehensive enough and whether they are used in the best way to address toxicity concerns, especially in predicting the effects of compounds developed beyond traditional pharmaceutical chemical space.

The ultimate success of a predictive toxicology approach to answering these safety concerns requires several key elements. Foremost among these is the availability of well-defined in vitro measures that reflect the response of important biological processes to xenobiotics. A vital part of this process is the selection of assays that provide data on the inhibition or activation of pathways that are central to mediating an adverse effect.

Many of the targets in such pathways may not have been identified yet. In these cases, we rely on correlative analysis with in vivo data to highlight those assays that provide a reliable marker of possible safety concern, even though we presently may not understand their role in that effect.

Additional factors that will contribute to this approach are the quality and depth of the in vitro dataset. The data must come from high signal-to-noise assays that unambiguously define a reliable response and can differentiate it from artifacts, such as the natural fluorescence common to certain classes of compounds.

These results will reflect both positive and negative information, each of which is important to the ultimate goal of predicting responses in humans. It is also imperative that the screening matrix be complete; that is, all compounds must be tested in all of the relevant assays. This allows compounds to be ranked for activity across a continuous scale, optimizing statistical power.
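The sketch below illustrates, with invented numbers, why a complete matrix supports ranking on a continuous scale: every compound has a potency value in every assay, so per-assay ranks and summary statistics can be computed without missing-data gaps.

```python
# Illustrative sketch: a complete compound x assay matrix of AC50 values (uM)
# converted to a continuous potency scale (pAC50) and ranked. Values are invented.
import numpy as np
import pandas as pd

ac50_um = pd.DataFrame(
    {"assay_A": [0.5, 30.0, 3.0], "assay_B": [1.2, 8.0, 50.0]},
    index=["cmpd_1", "cmpd_2", "cmpd_3"],
)

# Continuous potency scale: pAC50 = -log10(AC50 in molar); higher = more potent.
pac50 = -np.log10(ac50_um * 1e-6)

# Rank compounds within each assay and summarize overall activity across assays.
ranks = pac50.rank(ascending=False)
summary = pac50.mean(axis=1).sort_values(ascending=False)
print(ranks)
print(summary)
```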

Finally, the compound sets must be information rich, covering a wide chemical space with enough overlap among compounds within that space to allow statistical confidence in conclusions about specific chemical and biological interactions.

EPA’s ToxCast Program

This technology is already being applied as part of the Environmental Protection Agency’s (EPA) ToxCast program.

The EPA’s future strategic plan and long-term view “…is to identify in vitro assays that can predict the toxicity of chemical compounds in humans and animals, by comparing the results of in vitro testing with the toxicity data in the EPA’s in vivo toxicity database, ToxRefDB, and then to employ those predictive tests to supplement or replace existing animal-based tests, reducing cost and improving the speed of regulatory approval of new environmental chemicals.”

While predictive toxicology based on in vitro data may not be sufficiently advanced yet to replace traditional animal experimentation, it could provide the basis for quick and economical triage of new and existing chemical entities before more extensive animal studies.

ToxCast and related initiatives involve creating large datasets of in vitro data and then employing in silico methods to correlate that information with known animal and, in some cases, human toxicity data. We have demonstrated that biological signatures composed of readouts from several relevant assays can be derived from this dataset for specific adverse outcomes.
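One common way such a multi-assay signature might be derived in silico is with a sparse classifier that retains only a handful of assay readouts. The sketch below uses an L1-penalized logistic regression on simulated data as an illustrative stand-in for that idea; it is not the specific method applied to ToxCast, and the assay names and outcome are invented.

```python
# Hedged sketch: derive a candidate "signature" by fitting a sparse (L1) model
# so that only a few assay readouts carry weight. Data here are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 25))                  # stand-in assay readouts
y = (X[:, 3] + 0.8 * X[:, 11] + rng.normal(scale=0.5, size=200) > 0).astype(int)
assay_names = [f"assay_{i}" for i in range(X.shape[1])]

model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(Cs=10, penalty="l1", solver="liblinear", cv=5),
)
model.fit(X, y)

# Assays with nonzero coefficients form the candidate biological signature.
coefs = model.named_steps["logisticregressioncv"].coef_.ravel()
signature = [(name, round(c, 2)) for name, c in zip(assay_names, coefs) if abs(c) > 1e-6]
print(signature)
```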

This approach can also provide a means to explore biological mechanisms and pathways central to a compound’s toxicity and suggest ways to ameliorate those effects. In fact, the data from ToxCast has already revealed unsuspected target effects associated with observed chemical toxicity in vivo.

In the EPA ToxCast program, there is a unique opportunity to further develop tools that will provide the science of predictive toxicology with a prominent role in quickly and cost-effectively assessing the human and environmental safety of a variety of chemical entities.

Arthur Weissman, Ph.D. ([email protected]), is VP and CSO, and Benjamin Lineberry is a project manager for PerkinElmer’s Life Sciences & Technology Division.
