2022
DOI: 10.48550/arxiv.2211.08771
Preprint

On the symmetries in the dynamics of wide two-layer neural networks

Abstract: We consider the idealized setting of gradient flow on the population risk for infinitely wide two-layer ReLU neural networks (without bias), and study the effect of symmetries on the learned parameters and predictors. We first describe a general class of symmetries which, when satisfied by the target function f* and the input distribution, are preserved by the dynamics. We then study more specific cases. When f* is odd, we show that the dynamics of the predictor reduces to that of a (non-linearly parameteriz…

Cited by 0 publications
References 11 publications (18 reference statements)