2020
DOI: 10.1609/aaai.v34i04.6081

Adapting to Smoothness: A More Universal Algorithm for Online Convex Optimization

Abstract: We aim to design universal algorithms for online convex optimization, which can handle multiple common types of loss functions simultaneously. The previous state-of-the-art universal method has achieved the minimax optimality for general convex, exponentially concave and strongly convex loss functions. However, it remains an open problem whether smoothness can be exploited to further improve the theoretical guarantees. In this paper, we provide an affirmative answer by developing a novel algorithm, namely UFO, …
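For context (this is a standard summary of known results, not quoted from the paper): the minimax regret rates over T rounds that the abstract refers to, and the small-loss bounds that UFO targets for smooth losses, are conventionally written as

```latex
\begin{aligned}
&\text{general convex:} && R_T = O\!\left(\sqrt{T}\right) \\
&\alpha\text{-exp-concave:} && R_T = O\!\left(\tfrac{d}{\alpha}\log T\right) \\
&\lambda\text{-strongly convex:} && R_T = O\!\left(\tfrac{1}{\lambda}\log T\right) \\
&\text{smooth convex (small-loss):} && R_T = O\!\left(\sqrt{L_T^*}\right),
\quad L_T^* = \min_{x \in \mathcal{X}} \sum_{t=1}^{T} f_t(x)
\end{aligned}
```

Here $L_T^*$ is the cumulative loss of the best fixed comparator, so a small-loss bound is never worse than the corresponding $\sqrt{T}$-type bound and is much tighter when the best comparator incurs little loss.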

Cited by 5 publications (14 citation statements). References 12 publications.
“…One milestone is MetaGrad of van Erven and Koolen (2016), which can handle general convex functions as well as exp-concave functions. Later, Wang et al (2019) propose Maler, which further supports strongly convex functions explicitly. In a subsequent work, Wang et al (2020b) develop UFO, which exploits smoothness to deliver small-loss regret bounds, i.e., regret bounds that depend on the minimal loss.…”
Section: Introduction
confidence: 80%
“…Later, Wang et al (2019) propose Maler, which further supports strongly convex functions explicitly. In a subsequent work, Wang et al (2020b) develop UFO, which exploits smoothness to deliver small-loss regret bounds, i.e., regret bounds that depend on the minimal loss. However, the three aforementioned methods need to design one surrogate loss for each possible type of functions, which is both tedious and challenging.…”
Section: Introduction
confidence: 80%