2019
DOI: 10.1021/acs.jpclett.9b02037
Embedded Atom Neural Network Potentials: Efficient and Accurate Machine Learning with a Physically Inspired Representation

Abstract: We propose a simple but efficient and accurate machine learning (ML) model for developing high-dimensional potential energy surfaces. This so-called embedded atom neural network (EANN) approach is inspired by the well-known empirical embedded atom method (EAM) used in the condensed phase. It simply replaces the scalar embedded atom density in EAM with a Gaussian-type orbital based density vector, and represents the complex relationship between the embedded density vector and atomic energy by neural networks.
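The abstract's two-step idea — build a Gaussian-type-orbital (GTO) based density vector around each atom, then map it to an atomic energy with a neural network — can be sketched as below. This is a minimal illustration, not the authors' implementation: the orbital exponents `alphas`, the cosine cutoff, and the tiny one-hidden-layer network are all assumptions chosen for brevity (the actual EANN uses richer angular-resolved GTO features).

```python
import numpy as np

def embedded_density(r_center, r_neighbors, alphas, rc=6.0):
    """Sketch of an embedded density vector: each component is the squared
    sum of s-type Gaussian orbital amplitudes over neighbors within a
    smooth cutoff rc (names and forms are illustrative)."""
    rho = np.zeros(len(alphas))
    for k, a in enumerate(alphas):
        s = 0.0
        for r in r_neighbors:
            d = np.linalg.norm(np.asarray(r) - r_center)
            if d < rc:
                # cosine cutoff: smoothly sends contributions to zero at rc
                fc = 0.5 * (np.cos(np.pi * d / rc) + 1.0)
                s += np.exp(-a * d**2) * fc
        rho[k] = s**2  # density as the square of the summed amplitudes
    return rho

def atomic_energy(rho, W1, b1, W2, b2):
    """Tiny feed-forward network mapping the density vector to an energy."""
    h = np.tanh(rho @ W1 + b1)
    return float(h @ W2 + b2)

# Usage with random neighbors and random (untrained) weights:
rng = np.random.default_rng(0)
neighbors = rng.normal(size=(5, 3)) * 2.0
rho = embedded_density(np.zeros(3), neighbors, alphas=[0.5, 1.0, 2.0])
W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)
W2, b2 = rng.normal(size=8), 0.0
e = atomic_energy(rho, W1, b1, W2, b2)
```

The total energy would then be the sum of such atomic energies over all atoms, exactly as in the empirical EAM, with the network replacing EAM's fitted embedding function.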

Cited by 225 publications (301 citation statements)
References 37 publications
“…Both networks use the ResNet architecture 75. The size of the embedding network was set to (25, 50, 100) and the size of the embedding matrix was set to 12. The size of the fitting network was set to (240, 240, 240).…”
Section: Methods
confidence: 99%
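The quoted layer widths describe a DeePMD-style setup: an embedding network of widths (25, 50, 100) and a fitting network of widths (240, 240, 240). A generic sketch of stacking dense layers with such widths is below; the input width of 12 for the fitting network and the plain tanh layers are illustrative assumptions (the quoted model additionally uses ResNet-style skip connections, which are omitted here).

```python
import numpy as np

def mlp_params(widths, rng):
    """Random weights for a plain MLP with the given layer widths (sketch only)."""
    return [(rng.normal(size=(a, b)) / np.sqrt(a), np.zeros(b))
            for a, b in zip(widths[:-1], widths[1:])]

def mlp_forward(x, params):
    """Plain tanh MLP; ResNet variants would add skip connections between
    equal-width layers such as the (240, 240, 240) fitting stack."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

rng = np.random.default_rng(0)
# Fitting network with hidden widths (240, 240, 240) and a scalar energy output;
# the 12-wide input feature is a placeholder, not the actual descriptor size.
fit = mlp_params([12, 240, 240, 240, 1], rng)
y = mlp_forward(rng.normal(size=(4, 12)), fit)
```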
“…Neural networks constitute a very flexible and unbiased class of mathematical functions, which in principle are able to approximate any real-valued function to arbitrary accuracy. Since Behler and Parrinello proposed the high-dimensional neural network approach 19,20, several methods have been developed to implement this approach, and many different kinds of NN PESs have been proposed for water, small organic molecules, and metalloid materials [21][22][23][24][25], for example the sGDML [26][27][28], SchNet 29, PhysNet 30, and FCHL 31 methods.…”
confidence: 99%
“…Their PESs are fitted to different forms of MLPs (like Gaussian processes or deep neural networks), enabling fast global optimization (GO) of up to 100 atoms with an accuracy close to that of first-principles methods. There are several forms of MLPs in this field, like Behler–Parrinello symmetry functions [277], deep potentials [278], and the embedded atom neural network potential [279]. There is no consensus regarding which is the most suitable approach for GO of cluster structures.…”
Section: Challenges and Perspective
confidence: 99%
“…[48,49] For quantum molecular properties, geometry (and atom type) are typically chosen as input because even the slightest changes in geometry can affect the wavefunction and its observables. [50][51][52][53] On the other hand, macroscopic properties are more robust to higher-level or coarse-grained descriptors. [54][55][56][57][58] In the case of melting, for example, a molecular crystal may be identified by descriptors or a molecular fingerprint [59][60][61][62] derived from its repeating structural unit.…”
Section: Introduction
confidence: 99%