2020
DOI: 10.48550/arxiv.2002.12880
Preprint

Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data

Abstract: The translation equivariance of convolutional layers enables convolutional neural networks to generalize well on image problems. While translation equivariance provides a powerful inductive bias for images, we often additionally desire equivariance to other transformations, such as rotations, especially for non-image data. We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group with a surjective exponential map. Incorporating equivarian…
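As a rough picture of the construction summarized above, the sketch below parameterizes a convolution kernel as a small MLP applied to the Lie-algebra (log-map) offset between group elements and sums over sample points to approximate the integral over the group. This is a hypothetical illustration specialized to SO(2), not the authors' LieConv implementation; every function and parameter name here is made up for the example.

```python
import jax
import jax.numpy as jnp

# Minimal sketch (assumption: NOT the authors' code) of a group convolution whose
# kernel is an MLP of the log-map offset between group elements. Specialized to
# SO(2), where a group element is an angle and log(g_j^{-1} g_i) is the wrapped
# angle difference theta_i - theta_j.

def init_kernel_params(key, c_in, c_out, hidden=16):
    k1, k2 = jax.random.split(key)
    w1 = jax.random.normal(k1, (1, hidden)) * 0.1
    b1 = jnp.zeros(hidden)
    w2 = jax.random.normal(k2, (hidden, c_out * c_in)) * 0.1
    b2 = jnp.zeros(c_out * c_in)
    return (w1, b1, w2, b2, c_in, c_out)

def lie_conv(params, thetas, feats):
    """thetas: (n,) group elements given as angles; feats: (n, c_in) features."""
    w1, b1, w2, b2, c_in, c_out = params
    # Pairwise Lie-algebra offsets log(g_j^{-1} g_i), wrapped into (-pi, pi].
    diff = thetas[:, None] - thetas[None, :]
    diff = jnp.mod(diff + jnp.pi, 2 * jnp.pi) - jnp.pi
    # Kernel values k(log(g_j^{-1} g_i)) produced by a tiny MLP.
    h = jnp.tanh(diff[..., None] @ w1 + b1)                      # (n, n, hidden)
    k = (h @ w2 + b2).reshape(*diff.shape, c_out, c_in)          # (n, n, c_out, c_in)
    # Discretized group convolution: out_i = (1/n) * sum_j k(log(g_j^{-1} g_i)) f_j.
    return jnp.einsum('ijoc,jc->io', k, feats) / feats.shape[0]

key = jax.random.PRNGKey(0)
params = init_kernel_params(key, c_in=3, c_out=8)
thetas = jax.random.uniform(key, (10,), minval=-jnp.pi, maxval=jnp.pi)
feats = jax.random.normal(key, (10, 3))
out = lie_conv(params, thetas, feats)                            # (10, 8)
```

Because the kernel only sees pairwise log-map offsets, rotating every input angle by the same constant leaves `diff`, and hence the output attached to each point, unchanged, which is the equivariance property the abstract refers to.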

Cited by 29 publications (52 citation statements)
References 21 publications
“…Equivariant DL models have achieved remarkable success in learning image data (Cohen et al., 2019; Weiler & Cesa, 2019b; Cohen & Welling, 2016a; Chidester et al., 2018; Lenc & Vedaldi, 2015; Kondor & Trivedi, 2018; Bao & Song, 2019; Worrall et al., 2017; Cohen & Welling, 2016b; Finzi et al., 2020; Weiler et al., 2018b; Dieleman et al., 2016; Ghosh & Gupta, 2019; Sosnovik et al., 2020b).…”
Section: Equivariance and Invariance (mentioning)
confidence: 99%
“…The work of [10] also showed that by exploiting the Euler-Lagrange equation, it is possible to predict a controlled double pendulum, a system pertinent for controlled robots. A recent advance shows that Hamiltonian and Lagrangian NNs can be drastically improved if they are optimized over Cartesian coordinates with holonomic constraints [29].…”
Section: Related Work (mentioning)
confidence: 99%
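The Euler-Lagrange route referenced in the statement above can be written generically: given a (learned) Lagrangian L(q, qdot), accelerations follow from qddot = (d²L/dqdot²)^{-1} (dL/dq - (d²L/dq dqdot) qdot). The snippet below is an assumed, generic illustration using JAX automatic differentiation, not the cited papers' code; the analytic `lagrangian` is a stand-in for what would normally be a neural network.

```python
import jax
import jax.numpy as jnp

# Generic sketch (assumption: not the cited papers' implementation) of predicting
# accelerations from a Lagrangian via the Euler-Lagrange equation:
#   qddot = (d2L/dqdot2)^{-1} (dL/dq - (d2L/dq dqdot) qdot)

def lagrangian(params, q, qdot):
    # Analytic stand-in so the sketch runs; in practice a neural network of (q, qdot).
    w = params
    return 0.5 * jnp.dot(qdot, qdot) - jnp.dot(w, jnp.cos(q))

def acceleration(params, q, qdot):
    L = lambda q_, qd_: lagrangian(params, q_, qd_)
    dL_dq = jax.grad(L, argnums=0)(q, qdot)                          # dL/dq
    H = jax.hessian(L, argnums=1)(q, qdot)                           # d2L/dqdot2
    M = jax.jacfwd(jax.grad(L, argnums=1), argnums=0)(q, qdot)       # d2L/(dq dqdot)
    return jnp.linalg.solve(H, dL_dq - M @ qdot)

params = jnp.array([9.8, 9.8])          # made-up parameters for two coordinates
q = jnp.array([0.3, -0.1])
qdot = jnp.array([0.0, 0.5])
print(acceleration(params, q, qdot))    # accelerations implied by the Lagrangian
```

Holonomic constraints, as in the Cartesian-coordinate formulation cited above, would add constraint-force terms to this solve; the sketch omits them.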
“…For CNN we used 3 × 3 kernels and equivalently used n_L = 9 for the number of L_i in L-conv and random L-conv. We also used "LieConv" Finzi et al. (2020) as a baseline (Fig. 10, brown).…”
Section: Experiments On Images (mentioning)
confidence: 99%
“…For the symmetry group in LieConv we used SE(3). We also used the default ResNet architecture provided by Finzi et al. (2020) for both the one- and two-layer experiments. We turned off batch normalization, consistent with other experiments.…”
Section: Experiments On Images (mentioning)
confidence: 99%