2018
DOI: 10.48550/arxiv.1803.00718
Preprint

Smoothed Variable Sample-size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs

Abstract: We consider minimizing f(x) = E[f(x, ω)] when f(x, ω) is possibly nonsmooth and either strongly convex or convex in x. (I) Strongly convex. When f(x, ω) is µ-strongly convex in x, traditional stochastic approximation (SA) schemes often display poor behavior, arising in part from noisy subgradients and diminishing steplengths. Instead, we propose a variable sample-size accelerated proximal scheme (VS-APM) and apply it to f_η(x), the (η-)Moreau smoothed variant of E[f(x, ω)]; we term such a scheme (η-VS…
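For context on the smoothing the abstract refers to, it may help to recall the standard Moreau envelope. The definition below is the textbook one; the notation f_η matches the abstract, but the display itself is ours, not taken verbatim from the paper:

```latex
% Moreau envelope of f with smoothing parameter \eta > 0 (standard definition):
f_{\eta}(x) = \min_{u}\Bigl\{ f(u) + \tfrac{1}{2\eta}\,\|u - x\|^2 \Bigr\}
% Even when f is nonsmooth, f_\eta is differentiable with (1/\eta)-Lipschitz gradient,
%   \nabla f_\eta(x) = \bigl(x - \mathrm{prox}_{\eta f}(x)\bigr)/\eta,
% which is what makes accelerated (proximal) gradient schemes applicable to f_\eta.
```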

Cited by 15 publications (43 citation statements) | References 26 publications
“…While stochastic counterparts of the proximal gradient method (and its accelerated variants) have received much attention [31,32,33,34], stochastic generalizations of the proximal-point method have been less studied. Koshal, Nedić and Shanbhag [4] presented one of the first instances of a stochastic iterative proximal-point method for strictly monotone stochastic variational inequality problems and established almost-sure convergence.…”
Section: Preliminaries on Proximal-Point Schemes
confidence: 99%
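To make the distinction in the excerpt concrete: where a stochastic proximal gradient step linearizes the sampled function, a stochastic proximal-point step solves a small regularized subproblem in the sampled function itself. Below is a minimal NumPy sketch for the special case of least squares, where each per-sample subproblem has a closed form via the Sherman–Morrison identity. The setup and names (A, b, lam) are illustrative assumptions, not the scheme from the paper or the cited works:

```python
import numpy as np

def stochastic_proximal_point_ls(A, b, lam=1.0, iters=5000, seed=0):
    """Stochastic proximal-point iteration for min_x (1/2n) sum_i (a_i^T x - b_i)^2.

    Each step solves, for one sampled row (a, b_i),
        x_{k+1} = argmin_u 0.5*(a^T u - b_i)^2 + ||u - x_k||^2 / (2*lam),
    which for least squares reduces to the closed-form update below
    (Sherman-Morrison). Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        a, bi = A[i], b[i]
        # Closed-form prox step: x - lam * a * (a^T x - b_i) / (1 + lam * ||a||^2)
        x = x - lam * a * (a @ x - bi) / (1.0 + lam * (a @ a))
    return x

# Tiny usage example on synthetic, noiseless data.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
x_hat = stochastic_proximal_point_ls(A, b, lam=0.5)
print(np.linalg.norm(x_hat - x_true))  # should be small
```

One appeal of the proximal-point form, visible even in this toy case, is that the denominator 1 + lam*||a||^2 damps the step automatically, so the iteration tolerates much larger lam than a plain stochastic gradient step would.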
“…Rate statements (available for stochastic optimization) are summarized in Table 1. When the operator T can be cast as the sum of two operators A and B, splitting methods [38,39,40,41] have been studied extensively, both when the expectation-valued operator is single-valued in the optimization regime [31,32,33,34,42] and, more generally [43,44], when A is Lipschitz and expectation-valued while B is maximal monotone. Sample-average approximation techniques [45,46] have also been developed as an approximation framework.…”
Section: Preliminaries on Proximal-Point Schemes
confidence: 99%
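As a point of reference for the splitting discussion in this excerpt, the canonical forward-backward update for finding a zero of T = A + B is shown below. This is the textbook scheme, not a result from the cited works:

```latex
% Forward-backward splitting for 0 \in A(x) + B(x),
% with A single-valued (forward step) and B maximal monotone (backward/resolvent step):
x_{k+1} = (I + \gamma B)^{-1}\bigl(x_k - \gamma\, A(x_k)\bigr), \qquad \gamma > 0.
% In the stochastic setting, A(x_k) is replaced by a sampled estimate A(x_k, \omega_k),
% often averaged over a (possibly increasing) mini-batch.
```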