Human health and animal health risk assessment of combined exposure to multiple chemicals uses the same steps as single-substance risk assessment, namely problem formulation, exposure assessment, hazard assessment and risk characterisation. The main unique feature of combined risk assessment is the assessment of combined exposure, toxicity and risk. Recently, the Scientific Committee of the European Food Safety Authority (EFSA) published two relevant guidance documents. The first, “Harmonised methodologies for the human health, animal health and ecological risk assessment of combined exposure to multiple chemicals”, provides principles and explores methodologies for all steps of risk assessment, together with a reporting table. This guidance also supports the default assumption that dose addition is applied for the combined toxicity of the chemicals unless evidence for response addition or interactions (antagonism or synergism) is available. The second guidance document provides an account of the scientific criteria for grouping chemicals into assessment groups using hazard-driven criteria and prioritisation methods, i.e., exposure-driven and risk-based approaches. This manuscript describes these principles, provides a brief description of EFSA’s guidance documents and examples of applications in the human health and animal health areas, and concludes with a discussion of future challenges in this field.
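As a concrete illustration of the dose addition default, a combined risk metric such as the hazard index sums the exposure-to-reference-value ratios of the chemicals in an assessment group. The sketch below is a minimal, hypothetical example; the chemical names, exposure estimates and reference values are illustrative placeholders, not values taken from the EFSA guidance documents.

```python
# Minimal sketch: hazard index under the dose addition default assumption.
# Exposure estimates and health-based guidance values below are hypothetical
# illustrations, not values from the EFSA guidance documents.

exposures = {          # estimated exposure, mg/kg bw per day
    "chemical_A": 0.010,
    "chemical_B": 0.004,
    "chemical_C": 0.0005,
}
reference_values = {   # health-based guidance values, mg/kg bw per day
    "chemical_A": 0.10,
    "chemical_B": 0.02,
    "chemical_C": 0.01,
}

# Dose addition: sum the hazard quotients (exposure / reference value)
# over all chemicals in the assessment group.
hazard_quotients = {
    name: exposures[name] / reference_values[name] for name in exposures
}
hazard_index = sum(hazard_quotients.values())

print(hazard_quotients)                       # per-chemical hazard quotients
print(f"Hazard index: {hazard_index:.2f}")    # a value above 1 flags potential concern
```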
Dimensionality reduction techniques are crucial for enabling deep learning-driven quantitative structure-activity relationship (QSAR) models to navigate higher-dimensional toxicological spaces; however, the choice of specific techniques is often arbitrary and poorly explored. Six dimensionality reduction techniques (both linear and non-linear) were therefore applied to a high-dimensional mutagenicity dataset and compared in their ability to power a simple deep learning-driven QSAR model, following grid searches for optimal hyperparameter values. Comparatively simple linear techniques, such as principal component analysis (PCA), were found to be sufficient for optimal QSAR model performance, indicating that the original dataset was at least approximately linearly separable (in accordance with Cover’s theorem). However, certain non-linear techniques such as kernel PCA and autoencoders performed at closely comparable levels, while (especially in the case of autoencoders) being more widely applicable to potentially non-linearly separable datasets. Analysis of the chemical space, in terms of XLogP and molecular weight, revealed that the vast majority of the test data fell within the defined applicability domain, and that certain regions were measurably more problematic and degraded performance. The results nonetheless indicated that certain dimensionality reduction techniques facilitated uniquely beneficial navigation of the chemical space.
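The following is a minimal sketch of the general workflow described above, assuming scikit-learn and synthetic stand-in data; the study's actual descriptors, mutagenicity dataset, deep learning architecture and hyperparameter grids are not reproduced here.

```python
# Minimal sketch of a PCA -> neural-network QSAR pipeline, assuming scikit-learn.
# The random arrays below stand in for molecular descriptors and mutagenicity
# labels; they are not the dataset used in the study.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 1000))     # 1000 descriptor columns (high-dimensional input)
y = rng.integers(0, 2, size=500)     # binary mutagenicity labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([
    ("pca", PCA()),                                            # linear dimensionality reduction
    ("mlp", MLPClassifier(max_iter=500, random_state=0)),      # simple feed-forward classifier
])

# Grid search over hyperparameters, e.g. number of retained components
# and hidden-layer sizes (values are illustrative only).
grid = GridSearchCV(
    pipeline,
    param_grid={
        "pca__n_components": [10, 50, 100],
        "mlp__hidden_layer_sizes": [(32,), (64, 32)],
    },
    cv=3,
)
grid.fit(X_train, y_train)
print("Best parameters:", grid.best_params_)
print("Test accuracy:", grid.score(X_test, y_test))
```

A non-linear technique such as kernel PCA or an autoencoder could be substituted for the PCA step in the same pipeline when the data are not linearly separable.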