2019
DOI: 10.1021/acs.jpclett.9b02422

Machine Learning the Physical Nonlocal Exchange–Correlation Functional of Density-Functional Theory

Abstract: We train a neural network as the universal exchange-correlation functional of density-functional theory that simultaneously reproduces both the exact exchange-correlation energy and potential. This functional is extremely non-local, but retains the computational scaling of traditional local or semi-local approximations. It therefore holds the promise of solving some of the delocalization problems that plague density-functional theory, while maintaining the computational efficiency that characterizes the Kohn-Sham…
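
The abstract's central idea, training one model to reproduce both the exchange-correlation energy and its functional derivative (the potential), can be illustrated with a short sketch. This is not the authors' code: the network shape, grid size, and loss weighting below are assumptions, and the potential is obtained by automatic differentiation of the energy with respect to the grid-discretized density.

import jax
import jax.numpy as jnp

def xc_energy(params, n):
    # Toy fully non-local model: the scalar E_xc depends on the whole density vector n.
    W1, b1, W2 = params
    h = jax.nn.elu(W1 @ n + b1)
    return jnp.dot(W2, h)

def xc_potential(params, n, dx):
    # v_xc(x_i) = delta E_xc / delta n(x_i) ~ (1/dx) * dE_xc/dn_i on a uniform 1D grid.
    return jax.grad(xc_energy, argnums=1)(params, n) / dx

def loss(params, n, e_ref, v_ref, dx, lam=1.0):
    # Penalize errors in the energy and in the potential simultaneously.
    e, v = xc_energy(params, n), xc_potential(params, n, dx)
    return (e - e_ref) ** 2 + lam * jnp.mean((v - v_ref) ** 2)

# Usage on placeholder data (64-point grid, 32 hidden units):
G, H = 64, 32
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
params = (0.1 * jax.random.normal(k1, (H, G)), jnp.zeros(H), 0.1 * jax.random.normal(k2, (H,)))
n = jax.nn.softplus(jax.random.normal(k3, (G,)))
grads = jax.grad(loss)(params, n, e_ref=-1.0, v_ref=jnp.zeros(G), dx=0.1)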

Cited by 88 publications (79 citation statements, 2019–2023)
References 56 publications

“…Recently, machine-learning (ML)-based DFAs have been shown to break the constraints of Jacob's ladder, offering highly accurate, pure, and non-local density functionals for different one-dimensional model systems [10][11][12][13]. Although these approaches show that it is possible to learn the highly non-linear mapping from the electron density to the energy from data, they cannot be directly transferred to real systems.…”
mentioning
confidence: 99%
“…The first relies on the pragmatic application of system-dependent machine learning corrections to the interaction energy of approximate functionals with, for instance, ∆-ML. The second approach, which has been successfully applied to two-electron, one-dimensional systems [116]…”
Section: Results
mentioning
confidence: 99%
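
The ∆-ML idea mentioned in the statement above amounts to learning the residual between a cheap approximate energy and a higher-level reference. The descriptors, data, and least-squares regressor below are placeholders for illustration only, not anything from the cited works.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))            # per-system descriptors (placeholder)
e_base = rng.normal(size=100)            # energies from an approximate functional
e_ref = e_base + X @ np.full(8, 0.1)     # synthetic "exact" reference energies

# Fit the residual e_ref - e_base; a stand-in for whatever regressor the correction uses.
coef, *_ = np.linalg.lstsq(X, e_ref - e_base, rcond=None)
e_corrected = e_base + X @ coef          # Delta-ML-corrected prediction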
“…The ML methods considered above are ignorant of symmetry. This can be addressed by explicit or implicit symmetrization [46,98,136,157]. ML typically requires relatively costly optimization of parameters (NN) or hyper-parameters (GPR, KRR) or extensive evolutionary searches (GA-inspired schemes).…”
Section: Discussion
mentioning
confidence: 99%
“…Custodio et al. [97] used single- and two-hidden-layer NNs to learn an LSDA-type functional for a one-dimensional Hubbard model. Schmidt et al. [98] used a neural network to learn the universal exchange-correlation functional that simultaneously reproduces both the exact exchange-correlation energy and the potential, for a one-dimensional system with two strongly correlated electrons, in a non-singular potential. A multi-layer NN was used with exponential linear neurons (rather than common sigmoidal neurons), and symmetry was built into it by fixing the first-layer weights.…”
Section: Exchange-correlation Functionals
mentioning
confidence: 99%
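
The statement above notes that Schmidt et al. [98] used exponential linear units and built symmetry into the network by fixing the first-layer weights. The construction below is only an assumed illustration of that idea, enforcing mirror symmetry of a 1D density, not the authors' exact scheme.

import jax
import jax.numpy as jnp

G, H = 64, 32
W_half = jax.random.normal(jax.random.PRNGKey(0), (H, G // 2))
W1 = jnp.concatenate([W_half, W_half[:, ::-1]], axis=1)   # fixed, reflection-symmetric first layer

def features(n):
    # Exponential linear units on a non-trainable, symmetric first layer.
    return jax.nn.elu(W1 @ n)

n = jax.nn.softplus(jax.random.normal(jax.random.PRNGKey(1), (G,)))
assert jnp.allclose(features(n), features(n[::-1]))        # invariant under n(x) -> n(-x)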