2019
DOI: 10.1007/s00245-019-09609-7
Random Minibatch Subgradient Algorithms for Convex Problems with Functional Constraints

Abstract: In this paper we consider non-smooth convex optimization problems with (possibly) infinite intersection of constraints. In contrast to the classical approach, where the constraints are usually represented as intersection of simple sets, which are easy to project onto, in this paper we consider that each constraint set is given as the level set of a convex but not necessarily differentiable function. For these settings we propose subgradient iterative algorithms with random minibatch feasibility updates. At eac…
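The abstract describes an iteration that alternates a subgradient step on the objective with a feasibility update over a random minibatch of constraints, each given as a level set of a convex function. The following is a minimal Python sketch of that idea under assumptions of my own (Polyak-type feasibility steps averaged over the minibatch, a diminishing stepsize, and a toy problem); it is not the authors' exact scheme or stepsize rule.

```python
import numpy as np

def minibatch_subgradient(f_subgrad, constraints, x0, n_iters=2000,
                          batch_size=5, step=lambda k: 1.0 / np.sqrt(k + 1),
                          rng=None):
    """Illustrative subgradient method with random minibatch feasibility updates.

    f_subgrad(x) -> a subgradient of the objective at x
    constraints  -> list of (g, g_subgrad) pairs; the feasible set is {x : g(x) <= 0}
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        # 1) Objective update: plain subgradient step with a diminishing stepsize.
        x = x - step(k) * f_subgrad(x)
        # 2) Feasibility update: Polyak-type subgradient steps toward the level
        #    sets of a random minibatch of constraints, averaged together.
        idx = rng.choice(len(constraints), size=batch_size, replace=True)
        z = np.zeros_like(x)
        for j in idx:
            g, g_sub = constraints[j]
            viol = max(g(x), 0.0)
            d = g_sub(x)
            if viol > 0.0 and np.dot(d, d) > 0.0:
                z += x - (viol / np.dot(d, d)) * d  # step toward {g <= 0}
            else:
                z += x
        x = z / batch_size
    return x

# Toy usage: minimize ||x - c||_1 subject to halfspaces a_j^T x <= b_j.
rng = np.random.default_rng(0)
n, m = 5, 20
c = rng.normal(size=n)
A, b = rng.normal(size=(m, n)), rng.uniform(0.5, 1.5, size=m)
f_subgrad = lambda x: np.sign(x - c)
constraints = [(lambda x, a=a_, bb=b_: a @ x - bb,
                lambda x, a=a_: a) for a_, b_ in zip(A, b)]
x_hat = minibatch_subgradient(f_subgrad, constraints, x0=np.zeros(n))
print("max constraint violation:", max(g(x_hat) for g, _ in constraints))
```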

Cited by 10 publications (10 citation statements) · References 26 publications
“…Furthermore, in (Nedich, 2011) and (Nedich and Necoara, 2019) sublinear convergence rates are established for convex and strongly convex deterministic objective functions, respectively, while in this paper we prove (sub)linear rates under an expected composite objective function which is either convex or satisfies relaxed strong convexity conditions. Moreover, (Nedich, 2011; Nedich and Necoara, 2019) present separately the convergence analysis for smooth and nonsmooth objectives, while in this paper we present a unified convergence analysis covering both cases through the so-called stochastic bounded gradient condition. Hence, since we deal with stochastic composite objective functions, smooth or nonsmooth, and relaxed strong convexity assumptions, and since we consider a stochastic proximal gradient with new stepsize rules, our convergence analysis requires additional insights that differ from those of (Nedich, 2011; Nedich and Necoara, 2019).…”
Section: Introduction (mentioning confidence: 92%)
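The quoted statement refers to a "stochastic bounded gradient condition" that unifies the smooth and nonsmooth analyses. One common form of such a condition is sketched below as an assumption of mine; the citing paper may state it differently.

```latex
% Illustrative form of a "stochastic bounded gradient" condition (assumed here).
% For a composite objective F(x) = E_xi[ f(x, xi) ] + h(x) with stochastic
% (sub)gradients g(x, xi):
\[
  \mathbb{E}_{\xi}\!\left[ \| g(x,\xi) \|^{2} \right]
  \;\le\; L \left( F(x) - F^{*} \right) + B^{2}
  \qquad \text{for all feasible } x .
\]
% Smooth objectives satisfy this with B = 0 (L tied to the smoothness constant),
% while nonsmooth objectives with bounded subgradients satisfy it with L = 0,
% which is how a single inequality can cover both cases.
```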
“…However, the optimization problem, the algorithm and consequently the convergence analysis are different from the present paper. In particular, our algorithm is a stochastic proximal gradient extension of the algorithms proposed in (Nedich, 2011; Nedich and Necoara, 2019). Additionally, the stepsizes in (Nedich, 2011; Nedich and Necoara, 2019) are chosen decreasing, while in the present work for strongly convex-like objective functions we derive insightful stepsize-switching rules which describe when one should switch from a constant to a decreasing stepsize regime.…”
Section: Introduction (mentioning confidence: 99%)
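The stepsize-switching rule mentioned in the quote keeps a constant stepsize early on and then moves to a decreasing schedule. A minimal sketch of such a rule, with an assumed switch point and an assumed O(1/k) decay (not the citing paper's exact schedule), is:

```python
# Illustrative stepsize-switching rule: constant for the first k0 iterations,
# then O(1/k) decay, as is typical for strongly convex-like problems.
def switching_stepsize(k, mu=0.1, L=1.0, k0=None):
    if k0 is None:
        k0 = int(4 * L / mu)          # assumed switch point
    if k < k0:
        return 1.0 / (2.0 * L)        # constant regime
    return 2.0 / (mu * (k - k0 + 1))  # decreasing regime
```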