2023
DOI: 10.1021/acs.jctc.3c00868

Analyzing the Accuracy of Critical Micelle Concentration Predictions Using Deep Learning

Alexander Moriarty,
Takeshi Kobayashi,
Matteo Salvalaglio
et al.

Abstract: This paper presents a novel approach to predicting critical micelle concentrations (CMCs) by using graph neural networks (GNNs) augmented with Gaussian processes (GPs). The proposed model uses learned latent space representations of molecules to predict CMCs and estimate uncertainties. The performance of the model on a data set containing nonionic, cationic, anionic, and zwitterionic molecules is compared against a linear model that works with extended connectivity fingerprints (ECFPs). The GNN-based model per…
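As a rough, hypothetical sketch of the pipeline the abstract describes (a learned molecular representation feeding a Gaussian process), and not the authors' implementation: the toy code below mean-pools hand-made node features into a fixed-length molecular embedding as a stand-in for the GNN latent space, then fits a scikit-learn Gaussian process on those embeddings to predict log CMC with an uncertainty estimate. All data, feature choices, and function names here are illustrative assumptions.

```python
# Minimal, hypothetical sketch of the GNN + GP idea (not the paper's code):
# a stand-in "GNN" embedding from toy message passing, followed by a Gaussian
# process that predicts log(CMC) together with an uncertainty estimate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def embed_molecule(node_features, adjacency, n_rounds=2):
    """Toy message passing: average each atom's features with its neighbours',
    then mean-pool all atoms to a fixed-length molecular embedding."""
    a_hat = adjacency + np.eye(adjacency.shape[0])        # add self-loops
    a_hat = a_hat / a_hat.sum(axis=1, keepdims=True)      # row-normalise
    h = node_features
    for _ in range(n_rounds):
        h = a_hat @ h                                      # neighbour averaging
    return h.mean(axis=0)                                  # graph-level readout

# Hypothetical data: random (node_features, adjacency) pairs and made-up log(CMC) labels.
rng = np.random.default_rng(0)
graphs = []
for n_atoms in (5, 7, 6, 9):
    x = rng.normal(size=(n_atoms, 8))                      # 8 dummy atom features
    a = (rng.random((n_atoms, n_atoms)) < 0.3).astype(float)
    a = np.triu(a, 1) + np.triu(a, 1).T                    # symmetric, no self-loops
    graphs.append((x, a))
y_log_cmc = np.array([-2.1, -3.4, -2.8, -4.0])             # illustrative labels only

X = np.stack([embed_molecule(x, a) for x, a in graphs])
gp = GaussianProcessRegressor(kernel=RBF(1.0) + WhiteKernel(1e-3), normalize_y=True)
gp.fit(X, y_log_cmc)

mean, std = gp.predict(X[:1], return_std=True)             # prediction + uncertainty
print(f"predicted log CMC = {mean[0]:.2f} +/- {std[0]:.2f}")
```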

Citations: cited by 8 publications (4 citation statements)
References: 69 publications (127 reference statements)
“…This indicates a slightly worse performance on a less diverse test set compared to our GNN model with an RMSE of 0.24 evaluated on 218 different points, in which 100 distinct surfactant structures are included; i.e., for many surfactant structures, measurements at more than one temperature are present. In another recent work, Moriarty et al. analyzed the GNN model developed by Qin et al. and reported a GNN model combined with a Gaussian process (GP) with a test RMSE of 0.21, which is slightly lower than our findings. However, the test set was again limited to 22 molecules, which is only about 20% of ours and therefore may not represent the diversity of surfactant structures used in different practical applications.…”
Section: Results (contrasting)
confidence: 82%
“…An alternative end-to-end deep learning approach, called graph neural networks (GNNs), has been successfully applied to numerous molecular property prediction tasks. GNNs are applied directly to the molecular graph and extract the necessary structural information, which they later use to predict the target property, thereby providing an end-to-end learning framework. Due to their broad success and adoption, GNNs have been effectively applied to predict the CMC and surface excess concentration (Γ_m) of surfactant monomers. For both approaches, i.e., QSPR and GNNs, the temperature dependence of the CMC is rarely studied. In fact, the effect of temperature on the CMC has only been modeled in one recently published QSPR model, which is however limited to anionic surfactants.…”
Section: Introduction (mentioning)
confidence: 99%
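To make the "applied directly to the molecular graph" point in the excerpt above concrete, here is a small illustrative example (my own sketch, not taken from either cited work) that builds the inputs a molecular GNN typically consumes, a node-feature matrix and an adjacency matrix, from a SMILES string using RDKit; the feature choices are arbitrary assumptions.

```python
# Hedged illustration (not from the cited works): converting a SMILES string into
# the node-feature matrix and adjacency matrix that a molecular GNN consumes.
import numpy as np
from rdkit import Chem

def mol_to_graph(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse SMILES: {smiles}")
    # Minimal per-atom features: atomic number, degree, formal charge, aromaticity.
    node_features = np.array(
        [[atom.GetAtomicNum(), atom.GetDegree(), atom.GetFormalCharge(),
          int(atom.GetIsAromatic())] for atom in mol.GetAtoms()],
        dtype=float,
    )
    adjacency = np.asarray(Chem.GetAdjacencyMatrix(mol), dtype=float)  # atom connectivity
    return node_features, adjacency

# Example: sodium dodecyl sulfate (SDS), a common anionic surfactant.
x, a = mol_to_graph("CCCCCCCCCCCCOS(=O)(=O)[O-].[Na+]")
print(x.shape, a.shape)   # (n_atoms, 4) and (n_atoms, n_atoms)
```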
“…In this study, the property prediction model adopted the graph neural network (GNN) approach due to its superior performance (i.e., closer approximation to experimental data) and to avoid the need for informative descriptors as in quantitative structure–property relationships (QSPRs). Specifically, GNNs operate on molecular graphs, which are intuitive and flexible data representations that encode information on component atoms and atom connectivity (e.g., they preserve topological information on the molecular structure through an adjacency matrix) (Figure ). The underlying structure of the adopted GNN model closely mirrors that proposed by Qin et al. and Moriarty et al. At the outset, the GNN model integrates a sequence of graph network layers that adapt node features in a molecular graph based on those of interconnected atoms. The GNN model, devised by Kipf and Welling, uses an efficient layerwise propagation rule that is based on a first-order approximation of spectral convolutions of graphs.…”
Section: Methods (mentioning)
confidence: 86%
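For reference, the layerwise propagation rule by Kipf and Welling mentioned in the excerpt above is, in standard GCN notation (reproduced from the GCN literature, not from either cited paper's code):

$$ H^{(l+1)} = \sigma\!\left(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\,H^{(l)}\,W^{(l)}\right), \qquad \tilde{A} = A + I, \quad \tilde{D}_{ii} = \sum_j \tilde{A}_{ij}, $$

where A is the molecular adjacency matrix, H^{(l)} holds the node (atom) features at layer l, W^{(l)} is a learnable weight matrix, and σ is a nonlinearity; the self-loops in Ã and the symmetric normalization by D̃ implement the first-order approximation of spectral graph convolutions referred to in the quote.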
“…al. [14], and Moriarty et al. [30]. At the outset, the GNN model integrates a sequence of graph network layers that adapt node features in a molecular graph based on those of interconnected atoms.…”
Section: Predictive Model (mentioning)
confidence: 99%