2021
DOI: 10.48550/arxiv.2104.13790
Preprint

FastAdaBelief: Improving Convergence Rate for Belief-based Adaptive Optimizers by Exploiting Strong Convexity

Yangfan Zhou,
Kaizhu Huang,
Cheng Cheng
et al.

Abstract: The AdaBelief algorithm demonstrates superior generalization ability compared to the Adam algorithm by viewing the exponential moving average of observed gradients. AdaBelief is proved to have a data-dependent O(√T) regret bound when the objective functions are convex, where T is the time horizon. However, it remains an open problem how to exploit strong convexity to further improve the convergence rate of AdaBelief. To tackle this problem, we present a novel optimization algorithm called FastAdaBelief that can e…
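
For context, below is a minimal sketch of the standard AdaBelief update that the abstract builds on, written as plain Python/NumPy pseudocode. The function name adabelief_step, the default hyperparameters, and the bias-correction details are illustrative assumptions, and FastAdaBelief's strong-convexity modifications are not shown here since the abstract above is truncated.

```python
import numpy as np

def adabelief_step(theta, m, s, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBelief-style update step (illustrative sketch).

    theta : parameter vector
    m, s  : exponential moving averages of the gradient and of the
            squared deviation (grad - m), i.e. the "belief" term
    grad  : observed gradient at step t
    t     : 1-indexed step counter, used for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad                    # EMA of observed gradients
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2 + eps   # EMA of squared deviation from the EMA
    m_hat = m / (1 - beta1 ** t)                          # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)                          # bias-corrected belief term
    theta = theta - lr * m_hat / (np.sqrt(s_hat) + eps)   # parameter update
    return theta, m, s
```

In practice, m and s are initialized to zero vectors of the same shape as theta and the step is applied once per observed gradient; how FastAdaBelief modifies this scheme to exploit strong convexity is not visible in the truncated abstract.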
