2021
DOI: 10.21203/rs.3.rs-244137/v1
Preprint

SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials

Abstract: This work presents Neural Equivariant Interatomic Potentials (NequIP), an SE(3)-equivariant neural network approach for learning interatomic potentials from ab initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and act only on scalars, NequIP employs SE(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of…
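The equivariance property the abstract describes can be checked numerically. The sketch below is illustrative only (a toy vector-valued layer, not the NequIP architecture): an SE(3)-equivariant function f of atomic positions satisfies f(Rx) = R f(x), so rotating the input and rotating the output commute.

```python
import numpy as np

# Toy equivariant "layer" (assumption: a stand-in, not NequIP itself):
# each atom's output is the sum of displacement vectors to all other
# atoms, a degree-1 geometric tensor. This choice is translation-invariant
# (displacements cancel the shift) and rotation-equivariant.

def f(positions):
    # diffs[i, j] = x_j - x_i, shape (N, N, 3)
    diffs = positions[None, :, :] - positions[:, None, :]
    return diffs.sum(axis=1)  # per-atom vector feature: sum_j (x_j - x_i)

theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

x = np.random.default_rng(0).normal(size=(5, 3))  # 5 atoms in 3D

lhs = f(x @ R.T)   # rotate the inputs, then apply f
rhs = f(x) @ R.T   # apply f, then rotate the outputs
print(np.allclose(lhs, rhs))  # True
```

An invariant model, by contrast, would return identical scalar outputs for x and its rotation, which is exactly why vector-valued (equivariant) features carry more information about atomic environments.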


Cited by 56 publications (65 citation statements) · References 32 publications
“…LSTMs have recently been used to predict aeroelastic responses across a range of Mach numbers [70]. More generally, equivariant networks seek to encode various symmetries by construction, which should improve accuracy and reduce data requirements for physical systems [71][72][73][74]. Autoencoder networks enforce the physical notion that there should be low-dimensional structure, even for high-dimensional data, by imposing an information bottleneck, given by a constriction of the number of nodes in one or more layers of the network.…”
Section: The Architecture
confidence: 99%
“…First implementations used only interatomic distances, but more recently the full many-body description is utilized e.g. in "equivariant message passing networks" [40]. For high-dimensional systems, a cutoff radius for the information exchange is introduced making the methods feasible for larger systems, and for each message passing step (typically between two and six are employed) the effective range of the description of the atomic environments is increased.…”
Section: Representation and Regression
confidence: 99%
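The quoted point about cutoff radii and message passing can be made concrete with a toy sketch (assumption: a 1D atom chain, not any specific MLIP code). One message-passing step only exchanges information between atoms within the cutoff r_c, so after T steps an atom's effective receptive field extends to roughly T times r_c.

```python
import numpy as np

positions = np.arange(10, dtype=float)          # 10 atoms on a line, spacing 1.0
r_c = 1.5                                       # cutoff radius

dist = np.abs(positions[:, None] - positions[None, :])
adj = ((dist <= r_c) & (dist > 0)).astype(int)  # neighbors within the cutoff

def receptive_field(adj, steps):
    """Indices of atoms that can influence atom 0 after `steps` hops."""
    reach = np.zeros(len(adj), dtype=bool)
    reach[0] = True
    for _ in range(steps):
        reach |= (adj @ reach.astype(int)) > 0  # propagate one hop
    return np.flatnonzero(reach)

print(receptive_field(adj, 1))  # [0 1], direct neighbors only
print(receptive_field(adj, 3))  # [0 1 2 3], range grows with each step
```

This is why the quote notes that the effective range of the atomic-environment description increases with each message-passing step, while the per-step cost stays local to the cutoff sphere.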
“…To predict quantities that are fundamentally generated from equivariant interactions you must either 1) include these equivariant interactions in the scalar featurization used for an invariant model (which requires knowing to include those interactions) or 2) use an equivariant model, which may make more accurate or efficient predictions because it has more expressive operations [4,5].…”
Section: When Invariance Is Not Enough
confidence: 99%
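The limitation described in the quote can be demonstrated with a hedged toy example (not taken from the cited paper): pairwise distances are rotation-invariant, so a model that sees only distances must produce the same output for a rotated structure and therefore cannot, by itself, predict a vector-valued target (such as a force direction) that rotates with the input.

```python
import numpy as np

theta = 0.7
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])

x = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.5]])
x_rot = x @ Rz.T  # the same structure, rigidly rotated

def distance_matrix(p):
    # invariant featurization: all pairwise distances
    return np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)

def vector_target(p):
    # a stand-in vector quantity that rotates with the structure
    return p[1] - p[0]

print(np.allclose(distance_matrix(x), distance_matrix(x_rot)))  # True
print(np.allclose(vector_target(x), vector_target(x_rot)))      # False
```

Since the invariant features are identical for both inputs while the target differs, any function of those features alone gives the wrong answer for at least one of them; an equivariant model avoids this by carrying directional information through its features.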
“…molecular dynamics forces) from atomic geometries and initial atomic features (e.g. atom types) [5]. You can use them to manipulate atomic geometries and hierarchical features [14].…”
Section: Euclidean Neural Networks Are Equivariant Models
confidence: 99%