2022
DOI: 10.48550/arxiv.2205.08366
Preprint
A Deep-learning Model for Fast Prediction of Vacancy Formation in Diverse Materials

Abstract: The presence of point defects such as vacancies plays an important role in material design. Here, we demonstrate that a graph neural network (GNN) model trained only on perfect materials can also be used to predict vacancy formation energies (E_vac) of defect structures without the need for additional training data. Such GNN-based predictions are considerably faster than density functional theory (DFT) calculations, with reasonable accuracy, and show the potential that GNNs are able to capture a functional form…

Cited by 3 publications (3 citation statements)
References 37 publications
“…This pattern of comparable accuracy but increased generalizability also holds true for our method relative to other machine learning efforts for defect predictions, e.g., Frey et al.'s model for transition metal dichalcogenides [24] with MAE = 0.67 eV and Cheng et al.'s model for amorphous GeTe [70]. Finally, in a concurrent preprint with ours [71,72], Choudhary et al. used graph neural network models to predict the total energy of a host structure and of the structure with an atom removed to estimate vacancy formation enthalpies, but this neglects the relaxation of the host upon vacancy formation and yields an MAE of 1.5 eV for a single test set, including 2.3 eV for oxides.…”
supporting
confidence: 55%
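The calculation described in the citation statement above — estimating the vacancy formation energy from the total energies of the pristine host and the host with one atom removed — can be sketched as follows. This is a minimal illustration of the standard formula; the function name and the numerical values are illustrative assumptions, not data from either paper:

```python
def vacancy_formation_energy(e_defect: float, e_host: float, mu_atom: float) -> float:
    """Vacancy formation energy (eV) from supercell total energies.

    e_defect : total energy of the supercell with one atom removed
    e_host   : total energy of the pristine host supercell
    mu_atom  : chemical potential of the removed atom (e.g., from its
               elemental reference phase)
    """
    # E_vac = E(defect) - E(host) + mu(removed atom)
    return e_defect - e_host + mu_atom

# Illustrative numbers only (eV); not taken from either paper.
e_vac = vacancy_formation_energy(e_defect=-340.2, e_host=-345.0, mu_atom=-3.5)
print(f"E_vac ≈ {e_vac:.2f} eV")
```

Note that when `e_defect` comes from an unrelaxed structure (the host geometry with an atom simply deleted), the relaxation energy of the defective cell is neglected — the shortcoming the citing authors point out.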
“…[34,35] We developed an atomistic line graph neural network (ALIGNN) in our previous work [36], which can capture many-body interactions in a graph and successfully models more than 70 materials properties, both scalar and vector quantities, such as formation energy, bandgap, elastic modulus, superconducting properties, adsorption isotherms, and electron and phonon density of states [36-41]. The same automatic differentiation capability that allows training these complex models allows for physically consistent prediction of quantities such as forces and energies; this enables GNNs to be used to quickly identify relaxed or equilibrium states of complex systems. However, a large and diverse amount of data is needed to train unified force-fields.…”
Section: Introduction
mentioning
confidence: 99%