Predicting unanticipated harmful effects of chemicals and drug molecules is a difficult and costly task. Here we utilize a ‘big data compacting and data fusion’ concept to capture diverse adverse outcomes on cellular and organismal levels. The approach generates, from a transcriptomics data set, a ‘predictive toxicogenomics space’ (PTGS) tool composed of 1,331 genes distributed over 14 overlapping cytotoxicity-related gene space components. Involving ∼2.5 × 10^8 data points and 1,300 compounds to construct and validate the PTGS, the tool serves to: explain dose-dependent cytotoxicity effects, provide a virtual cytotoxicity probability estimate intrinsic to omics data, predict chemically induced pathological states in liver resulting from repeated dosing of rats, and, furthermore, predict human drug-induced liver injury (DILI) from hepatocyte experiments. Analysing 68 DILI-annotated drugs, the PTGS tool outperforms and complements existing tests, leading to a hitherto unseen level of DILI prediction accuracy.
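A minimal sketch of how samples might be scored against overlapping cytotoxicity-related gene components and the result converted into a probability-like estimate, in the spirit of the PTGS concept above. The component definitions, the mean-absolute-deregulation score and the logistic calibration below are illustrative assumptions, not the published method.

```python
# Hypothetical sketch: scoring samples against overlapping gene "space components"
# and mapping the result to a toy cytotoxicity probability. All parameters and
# gene sets are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

genes = [f"g{i}" for i in range(200)]              # stand-in gene identifiers
expr = rng.normal(size=(10, len(genes)))           # 10 samples x 200 genes (log-ratios vs. control)

# Overlapping components: each maps to a subset of genes (overlap is allowed).
components = {
    "comp_1": genes[0:40],
    "comp_2": genes[20:70],    # overlaps comp_1
    "comp_3": genes[60:120],
}
gene_index = {g: i for i, g in enumerate(genes)}

def component_scores(expr, components, gene_index):
    """Mean absolute deregulation of each component's genes, per sample."""
    scores = {}
    for name, members in components.items():
        idx = [gene_index[g] for g in members]
        scores[name] = np.abs(expr[:, idx]).mean(axis=1)
    return scores

def cytotoxicity_probability(scores, slope=3.0, midpoint=1.0):
    """Toy logistic mapping from the maximum component score to a probability."""
    top = np.max(np.stack(list(scores.values())), axis=0)
    return 1.0 / (1.0 + np.exp(-slope * (top - midpoint)))

scores = component_scores(expr, components, gene_index)
print(cytotoxicity_probability(scores))
```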
Radiotherapy remains the backbone of head and neck cancer therapy, but response is sometimes impeded by tumor radioresistance. Identifying predictive biomarkers of radiotherapy response is a crucial step towards personalized therapy. The aim of this study was to explore gene expression data in search of biomarkers predictive of the response to radiotherapy in head and neck squamous cell carcinoma (HNSCC). Microarray analysis was performed on five cell lines with varying intrinsic radiosensitivity, selected from a panel of 29 HNSCC cell lines. The bioinformatics approach included Gene Ontology (GO) enrichment profiling and Ingenuity Pathway Analysis (IPA). The GO analysis detected 16 deregulated categories, of which development, receptor activity, and extracellular region were the largest groups. Fourteen hub genes (CEBPA, CEBPB, CTNNB1, FN1, MYC, MYCN, PLAU, SDC4, SERPINE1, SP1, TAF4B, THBS1, TP53 and VLDLR) were identified from the IPA network analysis. The hub genes in the highest-ranked network (FN1, SERPINE1, THBS1 and VLDLR) were further subjected to qPCR analysis in the complete panel of 29 cell lines. Of these genes, high FN1 expression was associated with high intrinsic radiosensitivity (p = 0.047). In conclusion, gene ontologies and hub genes of importance for intrinsic radiosensitivity were defined. The overall results suggest that FN1 should be explored as a potential novel biomarker for radioresistance.
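A short sketch of how an association between FN1 expression and intrinsic radiosensitivity could be tested across a cell-line panel. The data are simulated, the surviving fraction at 2 Gy (SF2) is assumed as the radiosensitivity measure, and the Spearman rank correlation is an illustrative choice rather than the statistic reported in the study.

```python
# Illustrative association test between simulated FN1 qPCR values and a
# simulated radiosensitivity measure (SF2) across 29 cell lines.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

n_lines = 29
# SF2: surviving fraction at 2 Gy (lower = more radiosensitive).
sf2 = rng.uniform(0.2, 0.8, size=n_lines)
# Simulated expression: higher FN1 in the more radiosensitive (low-SF2) lines.
fn1_expr = -2.0 * sf2 + rng.normal(scale=0.3, size=n_lines)

rho, p_value = stats.spearmanr(fn1_expr, sf2)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```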
The aim of the SEURAT‐1 (Safety Evaluation Ultimately Replacing Animal Testing‐1) research cluster, comprising seven EU FP7 Health projects co‐financed by Cosmetics Europe, is to generate a proof of concept showing how the latest technologies, systems toxicology and toxicogenomics can be combined to deliver a test replacement for repeated dose systemic toxicity testing on animals. The SEURAT‐1 strategy is to adopt a mode‐of‐action framework to describe repeated dose toxicity, combining in vitro and in silico methods to derive predictions of in vivo toxicity responses. ToxBank is the cross‐cluster infrastructure project whose activities include the development of a data warehouse providing a web‐accessible shared repository of research data and protocols, a physical compounds repository, reference or “gold compounds” for use across the cluster (available via wiki.toxbank.net), and a reference resource for biomaterials. Core technologies used in the data warehouse include the ISA‐Tab universal data exchange format, REpresentational State Transfer (REST) web services, the W3C Resource Description Framework (RDF) and the OpenTox standards. We describe the design of the data warehouse based on cluster requirements, the implementation based on open standards, and finally the underlying concepts and initial results of a data analysis utilizing public data related to the gold compounds.
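As a rough sketch of the kind of client interaction these standards enable, the snippet below retrieves an RDF description of a resource over REST and iterates its triples with rdflib. The endpoint URL and resource path are hypothetical placeholders, not the actual ToxBank API.

```python
# Minimal sketch: fetch RDF metadata for a resource from a REST-style data
# warehouse and list its triples. The URI below is a placeholder.
import requests
from rdflib import Graph

ENDPOINT = "https://example.org/toxbank/protocol/123"   # hypothetical resource URI

response = requests.get(ENDPOINT, headers={"Accept": "application/rdf+xml"}, timeout=30)
response.raise_for_status()

# Parse the RDF/XML payload and print subject/predicate/object triples.
graph = Graph()
graph.parse(data=response.text, format="xml")
for subj, pred, obj in graph:
    print(subj, pred, obj)
```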
Toxicological research faces the challenge of integrating knowledge from diverse fields and novel technological developments in the biological and medical sciences. We discuss herein how the multiple facets of cancer research, including discovery related to mechanisms, treatment and diagnosis, overlap many emerging areas of interest in toxicology, including the need for improved methods and analysis tools. Common to both disciplines, in vitro and in silico methods serve as alternative investigation routes to animal studies. Knowledge of cancer development helps in understanding the relevance of chemical toxicity studies in cell models, and many bioinformatics-based cancer biomarker discovery tools are also applicable to computational toxicology. Robotics-aided, cell-based, high-throughput screening, microscale immunostaining techniques and gene expression profiling analyses are common tools in cancer research and, when sequentially combined, form a tiered approach to structured safety evaluation of thousands of environmental agents, novel chemicals or engineered nanomaterials (see the sketch below). Comprehensive tumour data collections in databases have been translated into clinically useful data, and this concept serves as a template for computer-driven evaluation of toxicity data into meaningful results. Future 'cancer research-inspired knowledge management' of toxicological data will aid the translation of basic discovery results and chemicals- and materials-testing data into information relevant to human health and environmental safety.

Assessing the intrinsic toxicological properties of environmental agents is central to defining hazard within the paradigm of human risk assessment used for decades now [1]. A constantly increasing number of chemicals and engineered nanomaterials (ENMs) emphasizes the need for applying novel tools and screening technologies beyond the standard, lengthy and expensive, rodent toxicological tests traditionally used in such work [1][2][3][4][5][6][7][8][9][10][11][12][13]. For the future innovation of novel ENMs in particular, safety evaluations should be integrated proactively and efficiently already in the material and product development phase [9,12]. Summarized under a concept termed 'Toxicity Testing in the 21st Century' or 'Tox21', biochemical and cell-based in vitro assays coupled with bioinformatics and modelling-driven in silico assays are now considered key to transforming toxicology from a previously animal-based testing practice into a computational science built on systems biology [2,4,6,[9][10][11]13]. The resulting novel research field is variably termed 'systems toxicology', 'toxicogenomics' or 'computational toxicology' [3,4,8,14,15]. Such efforts typically rely on informatics-driven and modelling-based analyses of results from several experimental systems and data-rich technologies for measuring pools of biological molecules, such as mRNAs, the overall aim being to interdependently analyse toxicity data and profiling results for understanding...
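The tiered testing idea referenced above can be illustrated with a toy pipeline in which compounds advance from a cheap high-throughput assay to more detailed follow-up only when flagged at the previous tier. The assay names, read-outs and thresholds below are invented for illustration and do not represent a validated screening scheme.

```python
# Conceptual sketch of a tiered screening pipeline: each tier filters the
# candidate list before handing it to the next, more detailed assay.
from typing import Callable, Dict, List

def tier(name: str, flagged: Callable[[str], bool]):
    def run(candidates: List[str]) -> List[str]:
        hits = [c for c in candidates if flagged(c)]
        print(f"{name}: {len(candidates)} tested, {len(hits)} flagged")
        return hits
    return run

# Placeholder assay read-outs keyed by compound identifier.
viability: Dict[str, float] = {"cmpd_A": 0.45, "cmpd_B": 0.92, "cmpd_C": 0.30}
stress_marker: Dict[str, float] = {"cmpd_A": 2.1, "cmpd_B": 0.9, "cmpd_C": 3.5}

pipeline = [
    tier("Tier 1: HTS viability", lambda c: viability[c] < 0.5),
    tier("Tier 2: immunostaining stress marker", lambda c: stress_marker[c] > 2.0),
]

candidates = list(viability)
for stage in pipeline:
    candidates = stage(candidates)
print("Advance to expression profiling:", candidates)
```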