2022
DOI: 10.48550/arxiv.2206.09450
Preprint
Data Augmentation vs. Equivariant Networks: A Theory of Generalization on Dynamics Forecasting

Abstract: Exploiting symmetry in dynamical systems is a powerful way to improve the generalization of deep learning. The model learns to be invariant to transformations and is hence more robust to distribution shift. Data augmentation and equivariant networks are the two major approaches to injecting symmetry into learning. However, their exact role in improving generalization is not well understood. In this work, we derive generalization bounds for data augmentation and equivariant networks, characterizing their effect on…
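The two approaches the abstract contrasts can be sketched in a toy setting. The snippet below is illustrative only (not from the paper): it uses a hypothetical cyclic rotation group C4 acting on 2D states, augments a training set with all group transforms, and checks that a hand-built rotation-equivariant predictor satisfies f(Rx) = R f(x) by construction.

```python
import numpy as np

# Toy symmetry group: 90-degree rotations of 2D states (cyclic group C4).
def rotate(x, k):
    """Rotate a batch of 2D vectors by k * 90 degrees."""
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    return x @ np.array([[c, -s], [s, c]]).T

# Approach 1: data augmentation -- enlarge the training set with every
# group transform, leaving the model class itself unconstrained.
def augment(states, targets):
    xs = np.concatenate([rotate(states, k) for k in range(4)])
    ys = np.concatenate([rotate(targets, k) for k in range(4)])
    return xs, ys

# Approach 2: an equivariant predictor -- symmetry is hard-wired, so
# f(Rx) = R f(x) holds exactly and no augmentation is needed.
def equivariant_step(x, dt=0.1):
    omega = 1.0  # hypothetical angular velocity of the dynamics
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    return x @ np.array([[c, -s], [s, c]]).T  # rotation commutes with C4

x = np.random.randn(8, 2)
# Equivariance check: predict-then-rotate equals rotate-then-predict.
assert np.allclose(rotate(equivariant_step(x), 1),
                   equivariant_step(rotate(x, 1)))
```

With data augmentation the symmetry is only encouraged statistically (the augmented set here is 4x larger), while the equivariant model satisfies it exactly for every input; the paper's bounds characterize how these choices affect generalization.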

Cited by 0 publications
References 13 publications