Summary
Despite widespread consensus on the need to transform toxicology and risk assessment to keep pace with the technological and computational changes that have revolutionized the life sciences, much work remains to achieve the vision of a toxicology built on a mechanistic foundation. A workshop was organized to explore one key aspect of this transformation: the development of Pathways of Toxicity (PoT) as a key tool for hazard identification based on systems biology. Several issues were discussed in depth. The first was the challenge of formally defining the concept of a PoT as distinct from, but complementary to, other toxicological pathway concepts such as mode of action (MoA). The workshop arrived at a preliminary definition of PoT as “a molecular definition of cellular processes shown to mediate adverse outcomes of toxicants”. It was further recognized that normal physiological pathways exist that maintain homeostasis and that these, when sufficiently perturbed, can become PoT. Second, the workshop sought to define adequate public and commercial resources for PoT information, including data, visualization, analyses, tools, and use cases, as well as the kinds of efforts that will be necessary to create such a resource. Third, the workshop explored ways in which systems biology approaches could inform pathway annotation, and which resources are needed and available to provide relevant PoT information to the diverse user communities.
Nuclear factor (NF)-kappaB p50 protein is involved in promoting survival of hippocampal neurons after trimethyltin (TMT) injury. In the current study, hippocampal NF-kappaB activity was examined and quantitated in transgenic kappaB-lacZ reporter mice after chemical-induced injury. NF-kappaB activity was localized primarily to hippocampal neurons and was significantly elevated over that in saline-treated mice between 4 and 21 days after TMT injection. Seven days after TMT injection, a timepoint of elevated NF-kappaB activity, gene expression in the hippocampus was studied by microarray analysis, comparing expression profiles of TMT-treated nontransgenic and p50-null mice with those of their saline-injected controls. Seventeen genes were increased in TMT-treated nontransgenic mice relative to saline-treated controls but showed no increase in p50-null mice, indicating a role for p50 in their regulation. One of these genes, the Na+,K+-ATPase gamma subunit, was detected in brain for the first time. Several of the genes modulated by NF-kappaB are potentially related to neuroplasticity, providing additional evidence that this transcription factor is a neuroprotective signal in the hippocampus.
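The selection criterion described above (genes induced by TMT in nontransgenic mice but not in p50-null mice) can be illustrated with a minimal sketch. This is not the authors' actual analysis pipeline; the column names, example values, and the fold-change cutoff are assumptions introduced purely for illustration.

```python
# Minimal sketch of the genotype-contrast filter described in the abstract.
# All gene names, expression values, and the 2-fold cutoff are hypothetical.
import pandas as pd

# Hypothetical summary table: mean expression per gene and treatment group.
df = pd.DataFrame({
    "gene":           ["GeneA", "GeneB", "GeneC"],
    "wt_saline":      [10.0, 50.0, 20.0],   # nontransgenic, saline-injected
    "wt_tmt":         [25.0, 55.0, 60.0],   # nontransgenic, TMT-injected
    "p50null_saline": [11.0, 48.0, 22.0],   # p50-null, saline-injected
    "p50null_tmt":    [12.0, 90.0, 70.0],   # p50-null, TMT-injected
})

FOLD = 2.0  # assumed threshold for calling an increase

wt_up   = df["wt_tmt"] / df["wt_saline"] >= FOLD            # increased by TMT in nontransgenic mice
ko_flat = df["p50null_tmt"] / df["p50null_saline"] < FOLD   # no comparable increase in p50-null mice

p50_dependent = df.loc[wt_up & ko_flat, "gene"]
print(p50_dependent.tolist())  # genes whose TMT induction appears to require p50
```

Run on the toy table above, only GeneA passes both filters, mirroring the logic used to flag the seventeen p50-dependent genes in the study.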
Toxicology has made steady advances over the last 60+ years in understanding the mechanisms of toxicity at increasingly finer levels of cellular organization. Traditionally, toxicological studies have relied on animal models; however, the general adoption of the 3R principles (Replace, Reduce, Refine) provided the impetus for the development of in vitro models for toxicity testing. The present commentary briefly discusses the transformation in toxicology that began around 1980. Many genes important in cellular protection and in the metabolism of toxicants were cloned and characterized in the 1980s, and gene expression studies also became feasible. The development of transgenic and knockout mice provided valuable animal models for investigating the role of specific genes in producing the toxic effects of chemicals or in protecting the organism from them. Further developments in toxicology came from the incorporation of the tools of "omics" (genomics, proteomics, metabolomics, interactomics), epigenetics, systems biology, computational biology, and in vitro biology. Collectively, the advances in toxicology made during the last 30-40 years are expected to provide more innovative and efficient approaches to risk assessment. A goal of experimental toxicology going forward is to reduce animal use while still conducting appropriate risk assessments and making sound regulatory decisions using alternative methods of toxicity testing. In that respect, Tox21 has provided a big-picture framework for the future. Currently, regulatory decisions involving drugs, biologics, food additives, and similar compounds still rely on data from animal testing and human clinical trials; in contrast, the prioritization of environmental chemicals for further study can be made using in vitro screening and computational tools.