2022
DOI: 10.48550/arxiv.2202.10670
Preprint

From Optimization Dynamics to Generalization Bounds via Łojasiewicz Gradient Inequality

Abstract: Optimization and generalization are two essential aspects of machine learning. In this paper, we propose a framework that connects optimization with generalization by analyzing the generalization error based on the length of the optimization trajectory under the gradient flow algorithm after convergence. Through our approach, we show that, with a proper initialization, gradient flow converges following a short path with an explicit length estimate. Such an estimate induces a length-based generalization bound, showing…
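As background for the abstract, the standard path-length argument enabled by a Łojasiewicz gradient inequality can be sketched in a few lines; the constant $c$ and exponent $\mu$ below are generic illustrative assumptions, not the paper's exact statement. Suppose the loss $L$ satisfies, near its minimal value $L^\ast$,
\[
\|\nabla L(\theta)\| \;\ge\; c\,\bigl(L(\theta) - L^\ast\bigr)^{\mu}, \qquad c > 0,\ \mu \in [1/2, 1).
\]
Along the gradient flow $\dot\theta(t) = -\nabla L(\theta(t))$ one has $\frac{d}{dt}L = -\|\nabla L\|^{2}$, hence
\[
\frac{d}{dt}\bigl(L(\theta(t)) - L^\ast\bigr)^{1-\mu}
= -(1-\mu)\,\bigl(L - L^\ast\bigr)^{-\mu}\,\|\nabla L\|^{2}
\;\le\; -(1-\mu)\,c\,\|\dot\theta(t)\|,
\]
and integrating over $t \in [0, \infty)$ yields a finite trajectory length,
\[
\int_{0}^{\infty} \|\dot\theta(t)\|\,dt \;\le\; \frac{\bigl(L(\theta_0) - L^\ast\bigr)^{1-\mu}}{c\,(1-\mu)}.
\]
Roughly, a short trajectory keeps the converged parameters close to a well-chosen initialization, which is the kind of control a length-based generalization bound can exploit.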

Cited by 1 publication (3 citation statements)
References 6 publications
“…Lastly, we particularly mention the work [25]. This work is related to ours since we both consider the connection between optimization and generalization by path estimate.…”
Section: Related Work
confidence: 99%
“…Chaining and conditional MI are also explored to derive more accurate bounds [3,37]. Besides, other techniques and approaches used to bound generalization error include model compression [2], margin theory [21], path length estimate [25] and linear stability of optimization algorithms [28].…”
Section: Related Work
confidence: 99%