2019
DOI: 10.1021/acs.chemmater.9b01294
Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals

Abstract: Graph networks are a new machine learning (ML) paradigm that supports both relational reasoning and combinatorial generalization. Here, we develop universal MatErials Graph Network (MEGNet) models for accurate property prediction in both molecules and crystals. We demonstrate that the MEGNet models outperform prior ML models such as SchNet in 11 out of 13 properties of the QM9 molecule data set. Similarly, we show that MEGNet models trained on ∼60,000 crystals in the Materials Project substantially outp…
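The graph-network paradigm the abstract describes alternates updates over edges, nodes, and a global state, each conditioned on the others. The following is a minimal illustrative sketch of one such block, assuming hand-written identity-plus-mean update functions in place of MEGNet's trained neural networks; all names here are hypothetical, not the MEGNet API.

```python
import numpy as np

def megnet_style_block(nodes, edges, senders, receivers, state):
    """One toy message-passing block: edges, then nodes, then the
    global state are updated in turn, each seeing the others."""
    # 1. Edge update: mix each edge with its endpoint nodes and the state.
    new_edges = edges + nodes[senders] + nodes[receivers] + state
    # 2. Node update: add the mean of each node's incident edges.
    new_nodes = nodes.copy()
    for i in range(len(nodes)):
        incident = (senders == i) | (receivers == i)
        if incident.any():
            new_nodes[i] = nodes[i] + new_edges[incident].mean()
    # 3. State update: pool nodes and edges into the global attribute.
    new_state = state + new_nodes.mean() + new_edges.mean()
    return new_nodes, new_edges, new_state

# Water as a toy graph: 3 atoms, 2 bonds, one scalar global attribute.
nodes = np.array([8.0, 1.0, 1.0])   # atomic numbers as node features
edges = np.array([0.96, 0.96])      # O-H bond lengths as edge features
senders = np.array([0, 0])
receivers = np.array([1, 2])
n, e, s = megnet_style_block(nodes, edges, senders, receivers, 0.0)
print(n.shape, e.shape)  # (3,) (2,)
```

In the real model each update is a learned neural network and the features are vectors rather than scalars, but the edge-node-state update order is the same.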

Cited by 943 publications (1,010 citation statements); references 61 publications.
“…Finally, graph‐based featurization has gained substantial interest in recent years. Graphs, which are natural representations for atoms (nodes) and the bonds between them (edges), have been used for molecules for many decades and have recently been applied to ML in crystals, achieving state‐of‐the‐art performance in predicting the formation energies, bandgaps, as well as metal/insulator classification …”
Section: Featurization
confidence: 99%
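The graph featurization the quoted statement describes maps atoms to node features and bonds to edge features. A common edge featurization in crystal graph models expands each scalar bond distance into a Gaussian radial basis; the sketch below illustrates that idea (basis centers, width, and bond lengths are hypothetical choices, not values from the paper).

```python
import numpy as np

def gaussian_expand(distances, centers, width=0.5):
    """Expand scalar bond distances into a Gaussian radial basis,
    a common edge featurization for crystal graph networks."""
    d = np.asarray(distances, dtype=float)[:, None]
    return np.exp(-((d - centers[None, :]) ** 2) / width ** 2)

centers = np.linspace(0.0, 5.0, 20)   # basis centers in Angstrom
bond_lengths = [0.96, 0.96, 1.52]     # hypothetical bonds
edge_features = gaussian_expand(bond_lengths, centers)
print(edge_features.shape)  # (3, 20)
```

Each bond thus becomes a smooth 20-dimensional vector peaked near its length, which downstream layers can learn from far more easily than a raw scalar.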
“…While PCA works on linear projections, manifold learning is able to capture nonlinear relationships. For example, the t‐distributed stochastic neighbor embedding (t‐SNE) method learns low‐dimensional representations such that the local distances between data points are roughly preserved, and has been applied in visualizing the elemental embedding vectors trained from materials property prediction models, structural similarity of perovskites, word embeddings in text mining, electronic fingerprints, etc.…”
Section: Featurization
confidence: 99%
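The linear baseline the statement contrasts with t-SNE can be sketched in a few lines: PCA via SVD projects high-dimensional embedding vectors onto their top two principal axes. The data below are random stand-ins for the elemental embedding vectors mentioned in the quote.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 16-dimensional embedding vectors for 40 "elements".
X = rng.normal(size=(40, 16))

# PCA via SVD: center, decompose, project onto the top-2 axes.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords_pca = Xc @ Vt[:2].T   # (40, 2) linear projection
print(coords_pca.shape)  # (40, 2)
```

t-SNE (e.g. `sklearn.manifold.TSNE(n_components=2).fit_transform(X)`) replaces this single linear map with a nonlinear embedding that preserves each point's local neighborhood, which is why it is preferred for visualizing learned embeddings.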