2022
DOI: 10.48550/arxiv.2202.06114
Preprint
Gradient-Free Optimization for Non-Smooth Saddle Point Problems under Adversarial Noise

Abstract: In this paper, we study gradient-free methods for non-smooth convex-concave minimax problems. We assume that a black-box gradient-free oracle returns an inexact function value corrupted by adversarial noise. We prove that a zeroth-order version of the mirror descent method with an inexact oracle is optimal both in oracle call complexity and in the maximum level of adversarial noise it tolerates.
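The scheme described in the abstract can be sketched as follows. This is an illustrative implementation, not the paper's exact algorithm: it uses a Euclidean prox (projection onto a ball), a two-point randomized gradient estimator over the joint variable, and a toy non-smooth convex-concave objective; the function name, objective, and all parameter values are our own assumptions.

```python
import numpy as np

def zo_mirror_descent_saddle(f, d_x, d_y, T=2000, step=0.01, tau=1e-3,
                             radius=1.0, seed=0):
    """Sketch of zeroth-order mirror descent for min_x max_y f(x, y).

    Euclidean prox case: the mirror step reduces to a projected
    (sub)gradient step.  The gradient is estimated from two function
    values along a random unit direction in the joint (x, y) space;
    an adversarially noisy oracle would return f(x, y) + xi with
    |xi| bounded, which this estimator tolerates up to O(tau) bias.
    """
    rng = np.random.default_rng(seed)
    x, y = np.zeros(d_x), np.zeros(d_y)
    x_avg, y_avg = np.zeros(d_x), np.zeros(d_y)
    for _ in range(T):
        # random direction on the unit sphere in the joint space
        e = rng.standard_normal(d_x + d_y)
        e /= np.linalg.norm(e)
        ex, ey = e[:d_x], e[d_x:]
        # two-point zeroth-order estimate of the joint gradient
        delta = f(x + tau * ex, y + tau * ey) - f(x - tau * ex, y - tau * ey)
        g = (d_x + d_y) * delta / (2.0 * tau) * e
        gx, gy = g[:d_x], g[d_x:]
        # descend in x, ascend in y, then project back onto the ball
        x = x - step * gx
        y = y + step * gy
        x = x / max(1.0, np.linalg.norm(x) / radius)
        y = y / max(1.0, np.linalg.norm(y) / radius)
        x_avg += x
        y_avg += y
    # averaged iterates, as is standard for mirror descent guarantees
    return x_avg / T, y_avg / T

# toy non-smooth convex-concave objective (illustrative only):
# f(x, y) = ||x||_1 - ||y||_1 + <x, y>, with saddle point at the origin
def toy_f(x, y):
    return np.abs(x).sum() - np.abs(y).sum() + x @ y
```

The averaging of iterates reflects the standard mirror-descent analysis, where convergence guarantees hold for the ergodic average rather than the last iterate.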

Cited by 1 publication (1 citation statement)
References 18 publications
“…A similar approach, utilizing different zeroth-order oracles, was developed and analyzed in [46]. In general, there is a plethora of zeroth-order optimization algorithms, and the interested reader is referred to [4,10,15,26,34], and the references therein.…”
Section: Introductionmentioning
confidence: 99%