2023
DOI: 10.1021/acs.chemmater.3c00715
Importance of Structural Features and the Influence of Individual Structures of Graphene Oxide Using Shapley Value Analysis

Amanda S. Barnard,
Bronwyn L. Fox

Abstract: The application of machine learning (ML) to materials chemistry can accelerate the design process, and when coupled with a detailed explanation, can guide future research. Shapley value analysis is a complementary approach capable of providing a comprehensive analysis of the underlying reasons behind a structure/property relationship. In this study, we have used data sets of graphene oxide nanomaterials generated using electronic structure simulations to train ML models with outstanding accuracy, generalizabil…

Cited by 5 publications (3 citation statements); References 50 publications
“…For instance, the spread of the Se data points or variance distribution of its SHAP values is larger for the UWAT_TE data compared to the temperature feature. For the other datasets, the temperature feature is considered the most important one by SHAP due to the wide distribution of its individual data points with SHAP values from the center SHAP value 0.0, as SHAP orders the feature importance based on the higher variance distribution of data points from the center of the SHAP value …”
Section: Results (mentioning, confidence: 99%)
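The statement above describes how SHAP summary plots order features: by the spread of each feature's per-sample attributions around zero. A minimal illustrative sketch of that ordering (not code from the cited papers; the feature names and SHAP values below are hypothetical):

```python
# Rank features the way a SHAP summary plot does: by the spread of each
# feature's per-sample SHAP values, measured here as mean |SHAP|.
# All names and values are hypothetical, for illustration only.

def rank_features_by_shap_spread(shap_values, feature_names):
    """Order features by mean |SHAP| (spread of attributions around 0)."""
    importance = []
    for j, name in enumerate(feature_names):
        column = [abs(row[j]) for row in shap_values]
        importance.append((name, sum(column) / len(column)))
    return sorted(importance, key=lambda t: t[1], reverse=True)

# Hypothetical per-sample SHAP values for two features.
shap_values = [
    [0.8, -0.10],   # columns: temperature, Se
    [-0.9, 0.20],
    [0.7, -0.15],
]
ranking = rank_features_by_shap_spread(shap_values, ["temperature", "Se"])
print(ranking[0][0])  # the feature with the widest attribution spread
```

Here "temperature" ranks first because its attributions spread much further from zero than those of "Se", which is exactly the behavior the quoted passage attributes to SHAP.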
“…Since ensemble tree models are not intrinsically interpretable [59], we choose to use SHAP (SHapley Additive exPlanations) values as an extrinsic tool for statistical interpretability [43]. SHAP has been recently utilized to understand structural features that lead to electrocatalysis in N-doped carbon materials [60] and electronic structure of graphene oxide [61]. Briefly, SHAP trains a given model with every permutation of input features and generates predictions on the data.…”
Section: ϵ 2D (mentioning, confidence: 99%)
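The quoted description (evaluating the model over every permutation of input features) corresponds to the exact Shapley computation: each feature's value is its marginal contribution averaged over all feature orderings. A toy sketch with a hypothetical value function (not the actual ML models from these studies):

```python
from itertools import permutations

def shapley_values(features, value):
    """Exact Shapley values: average each feature's marginal contribution
    over all orderings (permutations) of the feature set."""
    phi = {f: 0.0 for f in features}
    perms = list(permutations(features))
    for order in perms:
        coalition = set()
        for f in order:
            before = value(coalition)
            coalition.add(f)
            phi[f] += value(coalition) - before
    return {f: phi[f] / len(perms) for f in features}

# Hypothetical "model quality" for each subset of two input features.
scores = {frozenset(): 0.0,
          frozenset({"temp"}): 0.6,
          frozenset({"Se"}): 0.2,
          frozenset({"temp", "Se"}): 0.7}
phi = shapley_values(["temp", "Se"], lambda s: scores[frozenset(s)])
```

By construction the attributions sum to the full-model score (the efficiency property of Shapley values); practical SHAP implementations approximate this average rather than enumerating all n! orderings.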
“…Single chain properties are screened by Shapley additive explanations (SHAP) values to enhance the model performance through optimization of input features [40,41]. The correlation between the single chain properties and self-assembly morphologies is also analyzed using SHAP values. Phase diagrams of bottlebrush copolymers with given side chain sequences are predicted by single chain properties obtained from GCNs.…”
Section: Introduction (mentioning, confidence: 99%)
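Screening input features by SHAP, as described in this last statement, typically means keeping the features with the largest mean |SHAP| and discarding the rest. A hedged sketch of that screen (feature names and attributions are invented for illustration, not taken from the cited work):

```python
def screen_features(shap_values, feature_names, keep):
    """Keep the `keep` features with the largest mean |SHAP| value,
    as a simple SHAP-based input-feature screen."""
    means = []
    for j, name in enumerate(feature_names):
        mean_abs = sum(abs(row[j]) for row in shap_values) / len(shap_values)
        means.append((mean_abs, name))
    means.sort(reverse=True)  # largest mean |SHAP| first
    return [name for _, name in means[:keep]]

# Hypothetical attributions for four single-chain properties.
shap_values = [[0.5, 0.02, 0.3, 0.01],
               [-0.4, 0.01, -0.2, 0.02]]
names = ["Rg", "bond_angle", "persistence_len", "density"]
selected = screen_features(shap_values, names, keep=2)
```

Here the two features with near-zero attributions are dropped, shrinking the input set the downstream model is trained on.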