2016
DOI: 10.1137/141001536
New Douglas--Rachford Algorithmic Structures and Their Convergence Analyses

Abstract: In this paper we study new algorithmic structures with Douglas-Rachford (DR) operators to solve convex feasibility problems. We propose to embed the basic two-set-DR algorithmic operator into the String-Averaging Projections (SAP) and into the Block-Iterative Projection (BIP) algorithmic structures, thereby creating new DR algorithmic schemes that include the recently proposed cyclic Douglas-Rachford algorithm and the averaged DR algorithm as special cases. We further propose and investigate a new multiple-set…
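The basic two-set DR operator mentioned in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: the operator T = (I + R_B R_A)/2 (with R = 2P − I the reflection through a set) is the standard two-set DR map, while the particular sets, projection helpers, and parameter values below are illustrative assumptions.

```python
import numpy as np

def proj_ball(x, center, radius):
    """Euclidean projection onto the ball ||x - center|| <= radius."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def proj_halfspace(x, a, b):
    """Euclidean projection onto the halfspace <a, x> <= b."""
    viol = a @ x - b
    return x if viol <= 0 else x - viol * a / (a @ a)

def dr_operator(x, proj_A, proj_B):
    """Two-set Douglas-Rachford operator T = (I + R_B R_A)/2,
    where R = 2P - I reflects a point through a convex set."""
    reflect_A = 2 * proj_A(x) - x
    reflect_B = 2 * proj_B(reflect_A) - reflect_A
    return 0.5 * (x + reflect_B)

# Iterate T; the shadow sequence P_A(x_k) approaches a point of
# A ∩ B when the intersection is nonempty (illustrative sets here:
# a ball of radius 2 and the halfspace x_1 <= 1).
pA = lambda y: proj_ball(y, np.zeros(2), 2.0)
pB = lambda y: proj_halfspace(y, np.array([1.0, 0.0]), 1.0)
x = np.array([5.0, 5.0])
for _ in range(200):
    x = dr_operator(x, pA, pB)
shadow = pA(x)
```

The paper's contribution is not this operator itself but embedding it into string-averaging and block-iterative structures; the sketch only fixes the building block those structures compose.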


Cited by 13 publications (10 citation statements)
References 26 publications
“…In [20], it was shown that many convex minimization and monotone inclusion problems reduce to the more general problem of finding a fixed point of compositions of averaged operators, which provided a unified analysis of various proximal splitting algorithms. Along these lines, several fixed point methods based on various combinations of averaged operators have since been devised, see [1,2,5,9,11,13,14,17,18,24,25,38,46] for recent work. Motivated by deep neural network structures with thus far elusive asymptotic properties, we investigate in the present paper a novel averaged operator model involving a mix of nonlinear and linear operators.…”
Section: Introduction (mentioning)
confidence: 99%
“…String-averaging and block-iterative variants. In 2016, Censor and Mansour introduced the string-averaging DR (SA-DR) and block-iterative DR (BI-DR) variants [60]. SA-DR involves separating the index set I := {1, .…”
Section: Connection with Methods of Multipliers (ADMM) (mentioning)
confidence: 99%
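The string-averaging structure referenced in the quote above can be sketched generically: the index set is split into "strings", the operators of each string are applied sequentially, and the string endpoints are then convexly averaged. The helper names, demo operators, and weights below are illustrative assumptions; in the SA-DR variant of Censor and Mansour the per-index operators would be two-set DR operators rather than the plain projections used here.

```python
import numpy as np

def string_averaging_step(x, strings, operators, weights):
    """One sweep of a generic string-averaging scheme: apply the
    operators of each string sequentially to x, then return a convex
    combination of the resulting string endpoints."""
    endpoints = []
    for string in strings:
        y = x
        for i in string:          # sequential pass along one string
            y = operators[i](y)
        endpoints.append(y)
    return sum(w * e for w, e in zip(weights, endpoints))

# Illustration with projections onto hyperplanes <a_i, x> = b_i
# (consistent system with unique solution (1, 2)).
def proj_hyperplane(a, b):
    return lambda x: x - (a @ x - b) * a / (a @ a)

ops = [proj_hyperplane(np.array([1.0, 0.0]), 1.0),
       proj_hyperplane(np.array([0.0, 1.0]), 2.0),
       proj_hyperplane(np.array([1.0, 1.0]), 3.0)]
x = np.array([10.0, -4.0])
for _ in range(100):
    x = string_averaging_step(x, strings=[[0, 1], [2]],
                              operators=ops, weights=[0.5, 0.5])
```

With equal weights and two strings this sweep interpolates between fully sequential (one long string) and fully simultaneous (every index its own string) processing, which is the flexibility the SAP structure is designed to provide.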
“…This observation allows us to apply the acceleration technique to affine settings beyond projectors, including the Douglas-Rachford variants studied in [7,8,13,3]. The simplest realisation is the symmetrised Douglas-Rachford algorithm considered below.…”
Section: Extensions to Firmly Nonexpansive Operators (mentioning)
confidence: 99%