2022
DOI: 10.48550/arxiv.2201.12250
Preprint

Gradient Descent on Neurons and its Link to Approximate Second-Order Optimization

Abstract: Second-order optimizers are thought to hold the potential to speed up neural network training, but due to the enormous size of the curvature matrix, they typically require approximations to be computationally tractable. The most successful family of approximations is Kronecker-Factored, block-diagonal curvature estimates (KFAC). Here, we combine tools from prior work to evaluate exact second-order updates with careful ablations to establish a surprising result: Due to its approximations, KFAC is not closely r…
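For readers unfamiliar with KFAC, here is a minimal NumPy sketch of the Kronecker-factored, block-diagonal curvature update the abstract refers to, applied to a single fully-connected layer. This is an illustrative reconstruction, not code from the paper; the function name kfac_update and the damping and learning-rate values are assumptions for the sake of the example.

```python
import numpy as np

# Illustrative KFAC-style update for one fully-connected layer with
# weights W of shape (n_out, n_in). KFAC approximates the layer's
# curvature block as a Kronecker product F ≈ A ⊗ S, where
#   A = E[a a^T]  over the layer's inputs a, and
#   S = E[g g^T]  over back-propagated pre-activation gradients g.
# Inverting the two small factors replaces inverting the full
# (n_out * n_in) x (n_out * n_in) curvature block.

def kfac_update(grad_W, acts, grads_out, damping=1e-3, lr=0.1):
    """Return a KFAC-preconditioned step for grad_W.

    grad_W:    (n_out, n_in) gradient of the loss w.r.t. W
    acts:      (batch, n_in) layer inputs a
    grads_out: (batch, n_out) gradients w.r.t. the layer's pre-activations g
    """
    batch = acts.shape[0]
    A = acts.T @ acts / batch            # (n_in, n_in) input covariance
    S = grads_out.T @ grads_out / batch  # (n_out, n_out) gradient covariance
    A_damped = A + damping * np.eye(A.shape[0])
    S_damped = S + damping * np.eye(S.shape[0])
    # (A ⊗ S)^{-1} vec(grad_W) in matrix form: S^{-1} grad_W A^{-1}.
    precond = np.linalg.solve(S_damped, grad_W) @ np.linalg.inv(A_damped)
    return -lr * precond
```

Because the estimate is block-diagonal, each layer is preconditioned independently of the others; the paper's ablations probe how far this Kronecker approximation actually tracks true second-order updates.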

Cited by 0 publications
References 33 publications
