2017 Constructive Nonsmooth Analysis and Related Topics (Dedicated to the Memory of V.F. Demyanov) (CNSA)
DOI: 10.1109/cnsa.2017.7973937

Adaptive mirror descent for constrained optimization

Abstract: This paper addresses non-smooth convex and strongly convex optimization problems with functional constraints. The proposed Mirror Descent (MD) method with adaptive stepsizes is shown to achieve a better convergence rate than MD with fixed stepsizes, owing to an improved constant. For certain types of constraints, the method is proved to generate a dual solution. For the strongly convex case, the 'restart' technique is applied.
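
The scheme the abstract describes alternates "productive" steps on the objective with "nonproductive" steps on the violated constraint, with stepsizes driven by the observed subgradient norm. Below is a minimal Euclidean sketch of that idea, not the authors' exact pseudocode; the names adaptive_md, f_grad, g, g_grad, the tolerance eps, and the toy problem at the end are illustrative assumptions.

import numpy as np

def adaptive_md(f_grad, g, g_grad, x0, eps, max_iter=10_000):
    """Mirror descent for min f(x) s.t. g(x) <= 0, with Euclidean prox.

    While g(x) <= eps, take a 'productive' step along a subgradient of f;
    otherwise take a 'nonproductive' step along a subgradient of g.  Both
    stepsizes are eps/||d||^2, so only the observed subgradient norm is
    used, never a global Lipschitz constant.
    """
    x = np.asarray(x0, dtype=float)
    productive = []                        # eps-feasible iterates to average
    for _ in range(max_iter):
        if g(x) <= eps:                    # productive step on the objective
            d = f_grad(x)
            productive.append(x.copy())
        else:                              # nonproductive step on the constraint
            d = g_grad(x)
        sq = max(float(np.dot(d, d)), 1e-16)
        x = x - (eps / sq) * d             # adaptive stepsize eps/||d||^2
    return np.mean(productive, axis=0) if productive else x

# Toy usage: minimize max_i x_i subject to 1 - sum(x) <= 0 (optimum x_i = 1/5).
f_grad = lambda x: np.eye(len(x))[np.argmax(x)]   # a subgradient of max_i x_i
x_hat = adaptive_md(f_grad, lambda x: 1.0 - x.sum(),
                    lambda x: -np.ones_like(x), np.zeros(5), eps=1e-2)

Averaging only over the productive iterates is what lets the analysis bound the objective gap and the constraint violation by eps simultaneously.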

Cited by 3 publications (6 citation statements: 0 supporting, 6 mentioning, 0 contrasting); references 10 publications.
“…where the equality is a consequence of the definition of t* and the inequality is obtained from (5) and (9), we obtain f…”
Section: Algorithm 1 Feasible Level-set Methods
confidence: 92%
“…where the first inequality follows by (5); the second by (6); the third by (5); the fourth using (18) with k = K; the first equality by the geometric sum; and the fifth inequality by dropping negative terms.…”
Section: Overall Iteration Complexity
confidence: 99%
“…In comparison to known algorithms in the literature, the main advantage of our method for solving (1) is that the stopping criterion does not require knowledge of the constants M_f, M_g, and, in this sense, the method is adaptive. Mirror Descent with stepsizes not requiring knowledge of Lipschitz constants can be found, e.g., in [7] for problems without inequality constraints and, for constrained problems, in [5]. The algorithm is similar to the one in [2], but, for the sake of consistency with other parts of the chapter, we use a slightly different proof.…”
Section: Convex Non-smooth Objective Function
confidence: 99%
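
The adaptivity praised here is, concretely, a stopping rule built from observed subgradient norms. A hedged sketch of one such rule, extending the adaptive_md sketch above (it reuses that sketch's numpy import; theta0, an assumed bound on ||x0 - x*||, is the only extra input, and M_f, M_g never appear):

def adaptive_md_with_stop(f_grad, g, g_grad, x0, eps, theta0):
    """Run productive/nonproductive steps until the accumulated inverse
    squared subgradient norms reach 2*theta0**2/eps**2.  The criterion
    uses only observed norms, never the Lipschitz constants M_f, M_g."""
    x, productive, acc = np.asarray(x0, dtype=float), [], 0.0
    while acc < 2.0 * theta0**2 / eps**2:
        if g(x) <= eps:                    # productive step on the objective
            d = f_grad(x)
            productive.append(x.copy())
        else:                              # nonproductive step on the constraint
            d = g_grad(x)
        sq = max(float(np.dot(d, d)), 1e-16)
        acc += 1.0 / sq                    # progress measured in 1/||d||^2
        x = x - (eps / sq) * d             # adaptive stepsize eps/||d||^2
    return np.mean(productive, axis=0) if productive else x

Since each term 1/||d||^2 is at least 1/M^2 for M = max(M_f, M_g), the loop stops after at most 2*M^2*theta0^2/eps^2 iterations, matching the fixed-stepsize bound without ever evaluating M.
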
“…The idea of restarting a method for convex problems to obtain a faster rate of convergence for strongly convex problems dates back to the 1980s; see [19,20]. The algorithm is similar to the one in [2], but, for the sake of consistency with other parts of the chapter, we use a slightly different proof. To show that the restart technique is also possible for problems with inequality constraints, we rely on the following lemma.…”
Section: Strongly Convex Non-smooth Objective Function
confidence: 99%
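
A hedged sketch of the restart scheme for a mu-strongly convex objective, reusing adaptive_md_with_stop from the sketch above (mu and the initial radius bound theta0 are assumed known for illustration):

def restarted_md(f_grad, g, g_grad, x0, mu, theta0, eps):
    """Each stage runs the convex method to a gap eps_t tied to the current
    radius estimate theta; strong convexity, f(x) - f* >= (mu/2)||x - x*||^2,
    then shrinks theta by a factor sqrt(2) for the next stage."""
    x, theta = np.asarray(x0, dtype=float), theta0
    while True:
        eps_t = mu * theta**2 / 4.0        # stage target: gap <= mu*theta^2/4
        x = adaptive_md_with_stop(f_grad, g, g_grad, x, eps_t, theta)
        theta /= np.sqrt(2.0)              # ||x - x*||^2 <= 2*eps_t/mu = theta^2/2
        if eps_t <= eps:                   # last stage met the requested accuracy
            return x

Because eps_t halves at every restart, the scheme needs O(log(mu*theta0^2/eps)) stages, which is the standard way to convert the O(1/eps^2) complexity of the convex method into an O(1/(mu*eps)) complexity in the strongly convex case.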