2020
DOI: 10.1063/5.0021116

Recursive evaluation and iterative contraction of N-body equivariant features

Abstract: Mapping an atomistic configuration to a symmetrized N-point correlation of a field associated with the atomic positions (e.g., an atomic density) has emerged as an elegant and effective solution to represent structures as the input of machine-learning algorithms. While it has become clear that low-order density correlations do not provide a complete representation of an atomic environment, the exponential increase in the number of possible N-body invariants makes it difficult to design a concise and effective …
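
For orientation, the minimal Python sketch below illustrates the lowest non-trivial case referred to in the abstract: a rotationally invariant two-point correlation built from density expansion coefficients. It is not taken from the paper or any library; the array sizes and the random coefficients are placeholders, and in practice the coefficients come from expanding the atomic neighbour density on radial and spherical-harmonic bases.

```python
# Hedged sketch of an N = 2 symmetrized density correlation (power-spectrum-like
# invariant). Coefficients c[l][n, m] are random stand-ins for the expansion of
# one atomic environment's density; n indexes radial channels, m the 2*l + 1
# angular channels of degree l.
import numpy as np

n_max, l_max = 4, 3
rng = np.random.default_rng(0)
c = [rng.standard_normal((n_max, 2 * l + 1)) for l in range(l_max + 1)]

# invariant 2-point correlation: p[n1, n2, l] = sum_m c[l][n1, m] * c[l][n2, m]
# (summing over m makes each l-block rotationally invariant)
p = np.stack([c_l @ c_l.T for c_l in c], axis=-1)
print(p.shape)  # (n_max, n_max, l_max + 1)
```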

Cited by 76 publications (105 citation statements)
References 38 publications

“… 78 One can go further in body order explicitly while continuing to keep the regression linear. 64, 65, 67, 68, 73, 79 …”
Section: Learning Atomistic Properties (mentioning)
confidence: 99%
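
As a rough illustration of the point made in this excerpt, the sketch below fits a single linear ridge model to a concatenation of hypothetical 2-body and 3-body invariant blocks. All arrays, sizes, and the regularization value are made up for the example; only the overall pattern (higher body order, still a linear fit) reflects the quoted statement.

```python
# Hedged sketch: keep the regression linear while increasing the body order of
# the features. X2 and X3 stand in for blocks of 2-body and 3-body invariants
# per structure; y stands in for target energies.
import numpy as np

rng = np.random.default_rng(1)
n_structures = 200
X2 = rng.standard_normal((n_structures, 30))    # placeholder 2-body invariants
X3 = rng.standard_normal((n_structures, 120))   # placeholder 3-body invariants
X = np.hstack([X2, X3])                         # concatenated body-ordered blocks
y = rng.standard_normal(n_structures)           # placeholder targets

lam = 1e-3                                      # ridge regularization
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
y_pred = X @ w
```
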
“…For each value of the training set size, we show the mean absolute error (MAE) evaluated on the test set which consists of the remaining structures from the full dataset. Models based on FCHL (2018), 233 SOAP (2018), 69 aSLATM, 234 Coulomb Matrix (CM), 235 and Bag-of-bonds (BOB) 236 representations use Gaussian process/kernel ridge regression, whereas NICE 73 and MTP 67 use linear ridge regression, and SchNet 237 and PhysNet 238 are graph neural networks. GM-sNN uses a representation similar in spirit to MTP but based on a Gaussian radial basis set and a feed-forward neural network for regression.…”
Section: Validation and Accuracy (mentioning)
confidence: 99%
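
A hedged sketch of the evaluation protocol described in this excerpt, assuming scikit-learn is available: for each training set size, fit a linear ridge model and report the mean absolute error (MAE) on the held-out test structures. The representations and targets are random placeholders, so the printed numbers only demonstrate the MAE-versus-training-size loop, not any of the cited models.

```python
# Learning-curve-style evaluation: MAE on a fixed test set as a function of the
# number of training structures, using a linear ridge model on placeholder features.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.standard_normal((1000, 64))                     # placeholder representations
y = X @ rng.standard_normal(64) + 0.1 * rng.standard_normal(1000)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
for n_train in (50, 100, 200, 400, len(y_train)):
    model = Ridge(alpha=1e-6).fit(X_train[:n_train], y_train[:n_train])
    mae = mean_absolute_error(y_test, model.predict(X_test))
    print(f"n_train={n_train:4d}  test MAE={mae:.3f}")
```
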
“… 16 We indicate these as the two-point density correlation to stress that similar results are to be expected from any equivalent local featurization 17 such as atom-centered symmetry functions, 78 SNAP, 79 MTP, 80 ACE, 72 NICE. 69 We report errors in terms of the root mean square error (RMSE), or the percentage RMSE (RMSE%), which is expressed as a percentage of the standard deviation of the target properties.…”
Section: Results (mentioning)
confidence: 99%
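
The two error metrics defined in this excerpt can be written down directly. The sketch below (with made-up arrays) computes the RMSE and the RMSE%, i.e. the RMSE expressed as a percentage of the standard deviation of the target property.

```python
# RMSE and percentage RMSE (RMSE%) on placeholder data.
import numpy as np

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def rmse_percent(y_true, y_pred):
    # RMSE as a percentage of the standard deviation of the targets
    return 100.0 * rmse(y_true, y_pred) / np.std(y_true)

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.1, 1.9, 3.2, 3.8])
print(rmse(y_true, y_pred), rmse_percent(y_true, y_pred))
```
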
“…A particularly clean, efficient, recursive formulation can be derived exploiting the fact that the equivariant features behave as angular momenta, and can then be combined using Clebsch–Gordan coefficients to build higher-order correlations. 69 In analytical derivations we use a partially-discretized basis, in which the radial contribution is kept as a continuous index, corresponding to [equation not reproduced in this excerpt] with 〈x̂|lm〉 ≡ Yₗᵐ(x̂). Written in this basis, 〈arlm|ρᵢ〉 expresses the decomposition of the density in independent angular momentum channels, evaluated at a distance r from the central atom.…”
Section: Multi-scale Equivariant Representations (mentioning)
confidence: 99%
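
The following sketch illustrates the Clebsch–Gordan contraction described in this excerpt: features that transform like angular momenta l1 and l2 are combined into features of angular momentum L. It is not the authors' implementation; the function and variable names are invented, the feature arrays are random stand-ins for the (radial/chemical, m) channels, and complex-versus-real spherical-harmonic conventions are glossed over.

```python
# Hedged sketch of one Clebsch-Gordan iteration on equivariant features.
# feat arrays have shape (n_features, 2*l + 1), indexed by m = -l..l.
import numpy as np
from sympy.physics.quantum.cg import CG  # exact Clebsch-Gordan coefficients

def cg_combine(feat1, l1, feat2, l2, L):
    """Contract |l1 m1> x |l2 m2> features into |L M> features.

    feat1: (n1, 2*l1+1), feat2: (n2, 2*l2+1); returns (n1*n2, 2*L+1).
    """
    assert abs(l1 - l2) <= L <= l1 + l2, "L must satisfy the triangle rule"
    n1, n2 = feat1.shape[0], feat2.shape[0]
    out = np.zeros((n1 * n2, 2 * L + 1))
    for M in range(-L, L + 1):
        for m1 in range(-l1, l1 + 1):
            m2 = M - m1                  # coefficient vanishes unless m1 + m2 = M
            if abs(m2) > l2:
                continue
            c = float(CG(l1, m1, l2, m2, L, M).doit())
            # outer product over the feature (radial/chemical) indices
            out[:, M + L] += c * np.outer(feat1[:, m1 + l1],
                                          feat2[:, m2 + l2]).ravel()
    return out

# toy usage: combine degree-1 and degree-2 features into a degree-1 feature
f1 = np.random.rand(4, 3)    # l = 1
f2 = np.random.rand(5, 5)    # l = 2
f_L1 = cg_combine(f1, 1, f2, 2, 1)
print(f_L1.shape)            # (20, 3)
```

Iterating this contraction, with a fixed low-degree block on one side, is what builds correlations of increasing body order from the low-order equivariants.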