2023
DOI: 10.1002/adma.202210788
Machine‐Learning‐Assisted Determination of the Global Zero‐Temperature Phase Diagram of Materials

Abstract: Crystal‐graph attention neural networks have emerged recently as remarkable tools for the prediction of thermodynamic stability. The efficacy of their learning capabilities and their reliability are, however, subject to the quantity and quality of the data they are fed. Previous networks exhibit strong biases due to the inhomogeneity of the training data. Here a high‐quality dataset is engineered to provide a better balance across chemical and crystal‐symmetry space. Crystal‐graph neural networks trained with this…

Cited by 28 publications (10 citation statements)
References 89 publications
“…We started our transfer learning experiments by training crystal graph-attention neural networks 35 on a PBE dataset with 1.8 M structures 18 from the DCGAT database, and on the extended PBEsol and SCAN datasets from ref. 24.…”
Section: Results
confidence: 99%
“…13 The list includes AFLOW, 14 the OQMD, 15,16 the Materials Project, 17 and DCGAT. 18 An exception is the smaller JARVIS database that encompasses ∼55 k calculations obtained using OptB88‐vdW 19,20 and the modified Becke–Johnson potential. 21–23 The first large databases beyond the PBE functional have been published only recently.…”
Section: Introduction
confidence: 99%
“…The database was primarily generated by scanning binary, ternary, and quaternary prototypes to identify stable compounds. This process employed crystal graph attention networks 114,116,117 to predict the stability of all potential compositions for each prototype. Compounds that were found to be close to stability were subsequently confirmed using DFT.…”
Section: Alexandria
confidence: 99%
“…), which is fixed for the compounds in the training set. Including structural information in the definition of a model mainly improves the predictions if large training datasets (>10^5 data points) are used. Graph convolutional neural networks have notably been used to predict convex hull distances accurately and benefit greatly from structural features. Note that these can also be constructed with compositional information only. One downside to the inclusion of structural information in the models is that the optimized structure is not known prior to the search, so data for unrelaxed structures has to be used. This can notably be corrected by using ML interatomic potentials (MLIAPs), which are capable of performing relaxations.…”
Section: Introduction
confidence: 99%
“… 11–14 Including structural information in the definition of a model mainly improves the predictions if large training datasets (>10^5 data points) are used. 15 Graph convolutional neural networks 16–18 have notably been used to predict convex hull distances accurately and benefit greatly from structural features. 19,20 Note that these can also be constructed with compositional information only.…”
Section: Introduction
confidence: 99%