2022
DOI: 10.1038/s41467-022-29939-5
E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials

Abstract: This work presents Neural Equivariant Interatomic Potentials (NequIP), an E(3)-equivariant neural network approach for learning interatomic potentials from ab-initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions and only act on scalars, NequIP employs E(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-…
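The abstract's central claim is that symmetry should be built into the model rather than learned. A minimal numpy sketch (a toy pair potential, not NequIP's actual network) illustrates the underlying property: if the predicted energy is invariant under rotations and translations, then the forces obtained as its negative gradient are automatically equivariant — rotating the input atoms rotates the forces.

```python
import numpy as np

def toy_energy(pos):
    # Invariant toy "potential": a Gaussian of each pairwise distance,
    # summed over all off-diagonal pairs. Distances are unchanged by
    # rotations and translations, so the energy is E(3)-invariant.
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    mask = ~np.eye(len(pos), dtype=bool)
    return np.exp(-d[mask] ** 2).sum()

def toy_forces(pos, eps=1e-6):
    # Forces = -dE/dpos, here by central finite differences.
    f = np.zeros_like(pos)
    for i in range(pos.shape[0]):
        for k in range(3):
            dp = np.zeros_like(pos)
            dp[i, k] = eps
            f[i, k] = -(toy_energy(pos + dp) - toy_energy(pos - dp)) / (2 * eps)
    return f

rng = np.random.default_rng(0)
pos = rng.normal(size=(4, 3))

# Random proper rotation via QR, sign-fixed so det(Q) = +1.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(Q) < 0:
    Q[:, 0] *= -1

# Equivariance check: forces of the rotated system equal the rotated forces.
assert np.allclose(toy_forces(pos @ Q.T), toy_forces(pos) @ Q.T, atol=1e-5)
# Translation invariance: shifting all atoms leaves the forces unchanged.
assert np.allclose(toy_forces(pos + 1.0), toy_forces(pos), atol=1e-5)
```

NequIP extends this idea from invariant scalars to equivariant internal features (vectors and higher-order tensors), but the symmetry guarantee on the outputs is of the same kind as in this toy check.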

Cited by 727 publications (713 citation statements)
References 43 publications
“…For the rest of the paper, we will assume that they are encoded in spherical coordinates. This is in line with many equivariant models such as SOAP-GAP [2], SNAP [34], ACE [5,35] and its recursive implementations such as [36], NICE [37], NequIP [26], equivariant transformer [24], and SEGNNs [25]. By contrast, some equivariant MPNNs like NewtonNet [22], EGNN [20], or PaINN [21] express the features in Cartesian coordinates.…”
Section: Equivariant Messages (supporting, confidence: 64%)
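The snippet above distinguishes models whose equivariant features live in spherical coordinates (spherical-harmonic components, as in NequIP) from those that keep Cartesian vectors (e.g. PaiNN, EGNN). A small numpy illustration (my own toy example, not taken from any of the cited papers): for l = 1 the two encodings differ only by a fixed permutation, and the spherical features transform under rotation by a conjugated rotation matrix (the l = 1 Wigner-D matrix).

```python
import numpy as np

# For l = 1, the real spherical harmonics of a direction are proportional to
# (y, z, x)/|r| in m = -1, 0, +1 order, i.e. a fixed permutation P of the
# Cartesian components.
P = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

def l1_features(v):
    """l = 1 spherical-harmonic features of a vector (up to a constant factor)."""
    return P @ (v / np.linalg.norm(v))

rng = np.random.default_rng(1)
v = rng.normal(size=3)

# Random proper rotation R via QR, sign-fixed so det(R) = +1.
R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
if np.linalg.det(R) < 0:
    R[:, 0] *= -1

# Equivariance: features of the rotated vector equal D @ features, where
# D = P R P^T is the l = 1 Wigner-D matrix in this basis.
D = P @ R @ P.T
assert np.allclose(l1_features(R @ v), D @ l1_features(v))
```

For l > 1 the Wigner-D matrices are no longer simple conjugated rotations, which is where spherical-basis libraries earn their keep; but the transformation law checked here is the same one both families of models rely on.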
“…17 and U_t is the update function for each layer. In most MPNNs, the k channel of the message corresponds to the dimension of the learned embedding of the chemical elements [14,26]. We further need to extend Eq.…”
Section: Atomic Basis (mentioning, confidence: 99%)
“…Priors of particular importance for 3D structure modeling include rotation [49] and translation [50] equivariance. Such priors form the basis of 3D Euclidean transformations that can now directly be found within neural network layers [51,52,12,53,54] to increase networks' data efficiency and generalization capabilities [55,56]. Our work follows that of [12] to incorporate E(3)-equivariance in our message passing neural network for 3D structure refinement and quality assessment.…”
Section: Related Work (mentioning, confidence: 99%)