2022
DOI: 10.48550/arxiv.2201.11969
Preprint
Approximately Equivariant Networks for Imperfectly Symmetric Dynamics

Abstract: Incorporating symmetry as an inductive bias into neural network architecture has led to improvements in generalization, data efficiency, and physical consistency in dynamics modeling. Methods such as CNNs or equivariant neural networks use weight tying to enforce symmetries such as shift invariance or rotational equivariance. However, despite the fact that physical laws obey many symmetries, real-world dynamical data rarely conforms to strict mathematical symmetry, either due to noisy or incomplete data or to s…

Cited by 6 publications (9 citation statements)
References 29 publications
“…This is effective for sparsely sampled subgroups, such as rotation, but becomes less practical for more densely sampled groups such as translation, where the local support could lead to very sparse feature maps. Lastly, Wang et al. [28] propose a relaxed convolutional operator similar to our non-stationary approach, but require a low-rank factorization which potentially limits the expressivity of the feature maps. The same work briefly discusses a mathematical description without such factorization, similar to our proposal, which is deemed “too large a trainable weight space to be practical”.…”
Section: Related Work
confidence: 99%
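As a rough illustration of the relaxed weight-sharing idea attributed to Wang et al. [28] in the statement above — not the authors' actual implementation, and with all function and parameter names being illustrative assumptions — a "relaxed" convolution can be sketched as a per-output-position mixture of a small bank of shared filters. With constant mixing coefficients it reduces to an ordinary (strictly translation-equivariant) convolution; position-dependent coefficients softly break that symmetry:

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2D cross-correlation of a single-channel image x with kernel k."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relaxed_conv2d(x, kernel_bank, alpha):
    """Relaxed convolution: per-position mixture of L shared basis kernels.

    kernel_bank: (L, kh, kw) basis filters, shared across positions as in a CNN.
    alpha:       (L, H_out, W_out) position-dependent mixing coefficients.
                 If alpha is constant over positions, this is exactly a standard
                 convolution with the combined kernel sum_l alpha_l * K_l.
    """
    # One ordinary convolution per basis filter, stacked to (L, H_out, W_out).
    responses = np.stack([conv2d(x, k) for k in kernel_bank])
    # Mix the L responses with position-dependent weights and sum over L.
    return np.sum(alpha * responses, axis=0)
```

The low-rank aspect the quoted statement refers to is visible here: the full position-dependent kernel is constrained to a span of L basis filters rather than being freely parameterized at every position, which keeps the trainable weight space small at some cost in expressivity.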
“…Symmetries play a central role in physics. In [28], approximate equivariance is used to allow more robust models for dynamical systems. In Noether networks [1], conservation laws are inferred by learning symmetries from data.…”
Section: Related Work
confidence: 99%
“…Thus, some work has recently explored the idea of building approximately equivariant models and empirically demonstrated its benefits in modeling real-world data (van der Ouderaa et al., 2022; Romero & Lohit, 2021; Finzi et al., 2021). For example, Wang et al. (2022) designed approximately equivariant models by relaxing the weight-sharing schemes in equivariant convolutional networks.…”
Section: Introduction
confidence: 99%
“…We show that when the underlying dynamics is symmetric, equivariant networks achieve a tighter generalization bound than data augmentation. Furthermore, when the symmetries in the data are only approximate, the generalization bound for approximately equivariant networks (Wang et al., 2022) is further improved.…”
Section: Introduction
confidence: 99%