2018
DOI: 10.1007/978-3-319-97478-1_7

Stochastic Forward Douglas-Rachford Splitting Method for Monotone Inclusions

Abstract: We propose a stochastic Forward-Douglas-Rachford Splitting framework for finding a zero point of the sum of three maximally monotone operators in a real separable Hilbert space, where one of the operators is cocoercive. We characterize the rate of convergence in expectation in the case of strongly monotone operators. We provide guidance on step-size sequences that achieve this rate, even if the strong convexity parameter is unknown.
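The abstract gives no pseudocode, so the following is a minimal sketch of a three-operator splitting iteration of the Davis-Yin/forward-Douglas-Rachford family, specialized to minimizing f(x) + g_1(x) + g_2(x) with a stochastic gradient standing in for the cocoercive term. All names (stochastic_fdr, prox_g1, prox_g2, stoch_grad, gammas) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def stochastic_fdr(prox_g1, prox_g2, stoch_grad, z0, gammas, n_iters, rng):
    """Davis-Yin-style three-operator splitting with a stochastic
    estimate of the cocoercive (gradient) term -- illustrative sketch.

    prox_g1, prox_g2 : callables (v, gamma) -> prox of gamma*g at v
    stoch_grad       : callable (x, rng) -> unbiased estimate of the
                       cocoercive operator evaluated at x
    gammas           : callable k -> step size gamma_k; under strong
                       monotonicity a decaying schedule such as c/(k+1)
                       is the kind of sequence the abstract refers to
    """
    z = np.array(z0, dtype=float)  # governing sequence
    x1 = z
    for k in range(n_iters):
        gamma = gammas(k)
        x1 = prox_g1(z, gamma)                 # backward step on g_1
        g = stoch_grad(x1, rng)                # stochastic forward step
        x2 = prox_g2(2.0 * x1 - z - gamma * g, gamma)  # reflected backward step on g_2
        z = z + (x2 - x1)                      # fixed-point update (relaxation 1)
    return x1
```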

Cited by 13 publications (21 citation statements).
References 31 publications (68 reference statements).
“…Moreover, task (4), in the case where J = 2, X := X_0 = X_1 = X_2, H_1 = H_2 = Id, r_1 = r_2 = 0, and A := X, i.e., min_{x∈X} [f(x) + g_1(x) + g_2(x)], has also attracted attention in the context of the “three-term operator splitting” framework [18,19]. As in [14,15,16], ∇f, Prox_{g_1} and Prox_{g_2} are employed via computationally efficient recursions in [18,19] to generate a sequence which converges weakly (and, under certain hypotheses, strongly) to a solution of the minimization task at hand. All studies in [14,15,16,18,19] […] task (1), in the case where X is a Euclidean space and A := {x ∈ X | aᵀx = 0}, for some a ∈ X \ {0}, was treated, within a stochastic setting, in [20].…”
Section: Prior Art
Confidence: 99%
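As a concrete, hypothetical instantiation of such a recursion, consider min_x ½‖Mx − b‖² + λ‖x‖₁ + ι_{[−1,1]ⁿ}(x): the prox of the ℓ₁ term is soft-thresholding, the prox of the box indicator is the metric projection (clipping), and a minibatch gradient provides an unbiased estimate of ∇f. This reuses the stochastic_fdr sketch above; the problem data and step-size schedule are invented for illustration, not taken from [18,19] or [20].

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
lam = 0.1

# prox of gamma * lam * ||.||_1 : soft-thresholding
prox_g1 = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - gamma * lam, 0.0)
# prox of the indicator of the box [-1, 1]^n : metric projection
prox_g2 = lambda v, gamma: np.clip(v, -1.0, 1.0)

def stoch_grad(x, rng, batch=20):
    """Unbiased minibatch estimate of grad f, f(x) = 0.5 * ||Mx - b||^2."""
    idx = rng.integers(0, M.shape[0], size=batch)
    Mi = M[idx]
    return (M.shape[0] / batch) * Mi.T @ (Mi @ x - b[idx])

x = stochastic_fdr(prox_g1, prox_g2, stoch_grad, np.zeros(50),
                   lambda k: 1.0 / (k + 10), 2000, rng)
```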
“…Theorems 3.1 and 3.6. Fixed-point theory, variational inequalities and affine-nonexpansive mappings are utilized to accommodate the affine constraint A in a more flexible way (see, e.g., Proposition 2.10 and Example A.4) than the usage of the indicator function and its associated metric-projection mapping that methods [15,16,18,19] promote. Such flexibility is combined with the first-order information of f and the proximal mapping of g to build recursions of tunable complexity that can achieve low-computational-complexity footprints, well-suited for large-scale minimization tasks.…”
Section: Contributions
Confidence: 99%