2000
DOI: 10.1007/s101079900113

Forcing strong convergence of proximal point iterations in a Hilbert space

Abstract: This paper is concerned with convergence properties of the classical proximal point algorithm for finding zeroes of maximal monotone operators in an infinite-dimensional Hilbert space. It is well known that the proximal point algorithm converges weakly to a solution under very mild assumptions. However, it was shown by Güler [11] that the iterates may fail to converge strongly in the infinite-dimensional case. We propose a new proximal-type algorithm which does converge strongly, provided the problem ha…
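
For orientation, the classical proximal point algorithm referred to in the abstract iterates the resolvent of $T$: $x_{k+1} = (I + c_k T)^{-1}(x_k)$ with parameters $c_k > 0$. Below is a minimal numerical sketch of that iteration for the illustrative choice $T(x) = Ax$ with $A$ positive semidefinite, where the resolvent reduces to a linear solve; the operator, step size, and stopping test are assumptions for this example, not the strongly convergent algorithm proposed in the paper.

```python
import numpy as np

def proximal_point(A, x0, c=1.0, tol=1e-10, max_iter=1000):
    """Classical proximal point iteration x_{k+1} = (I + c*T)^{-1}(x_k)
    for the monotone linear operator T(x) = A @ x (A positive semidefinite).
    Illustrative sketch only: A, c, and tol are assumptions, not the paper's data.
    """
    x = np.asarray(x0, dtype=float)
    I = np.eye(len(x))
    for k in range(max_iter):
        # Resolvent (proximal) step: solve (I + c*A) x_next = x.
        x_next = np.linalg.solve(I + c * A, x)
        if np.linalg.norm(x_next - x) <= tol:
            return x_next, k
        x = x_next
    return x, max_iter

# A has a nontrivial null space, so T has infinitely many zeroes; for this
# symmetric A the limit is the projection of x0 onto the null space of A.
A = np.array([[2.0, 0.0], [0.0, 0.0]])
x_star, iters = proximal_point(A, x0=[1.0, 1.0])
print(x_star, iters)  # expected: approximately [0.0, 1.0]
```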

Cited by 292 publications (154 citation statements). References 30 publications.
“…By (13), (14) and the estimate $\|x_n - \bar y_n\| \le \|x_n - x_{n+1}\| + \|x_{n+1} - \bar y_n\|$, we get $\lim_{n\to\infty} \|x_n - \bar y_n\| = 0$. From the definition of $i_n$, we obtain (15) $\lim_{n\to\infty} \|x_n - y^i_n\| = 0$ for all $i = 1, 2, \ldots$…”
Section: Results
mentioning
confidence: 87%
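
For readers without equations (13)–(14) at hand: assuming, as the wording suggests, that they state $\|x_n - x_{n+1}\| \to 0$ and $\|x_{n+1} - \bar y_n\| \to 0$, the quoted step is just the triangle inequality passed to the limit:

```latex
\[
\|x_n - \bar y_n\|
  \;\le\; \|x_n - x_{n+1}\| + \|x_{n+1} - \bar y_n\|
  \;\longrightarrow\; 0
  \qquad (n \to \infty).
\]
```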
“…Condition (4) is an assumption on the whole generated sequence $\{x_n\}$ and the error term sequence $\{e_n\}$, and thus seems to be slightly stronger, but it can be checked and enforced in practice more easily than the criteria that existed earlier. Furthermore, da Silva e Silva et al. [14] and Solodov and Svaiter [15][16][17] very recently proposed some new accuracy criteria for proximal point algorithms. Their criteria, rather than the imposed inequality (3), require only that $\sup_{n \ge 0} \sigma_n < 1$.…”
Section: (X) -
mentioning
confidence: 99%
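
For context, the two kinds of accuracy conditions being contrasted can be written, in a common (assumed) notation with $J_{c} := (I + cT)^{-1}$ the exact resolvent, roughly as follows; the precise forms of (3) and (4) in the citing paper may differ:

```latex
\[
\text{summable-error criterion:}\qquad
  \|x_{n+1} - J_{c_n}(x_n)\| \le \varepsilon_n,
  \qquad \sum_{n=0}^{\infty} \varepsilon_n < \infty,
\]
\[
\text{relative-error criterion:}\qquad
  \|x_{n+1} - J_{c_n}(x_n)\| \le \sigma_n\,\|x_{n+1} - x_n\|,
  \qquad \sup_{n\ge 0} \sigma_n < 1.
\]
```

The second form can be verified at each iteration without fixing a summable tolerance sequence in advance, which is the practical advantage alluded to in the excerpt.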
“…On the other hand, He [9] gave another inexact criterion for the study of monotone general variational inequalities, which involves a relation between the error term and the residual function; in other words, the restriction $\sum_{n=0}^{\infty} \sigma_n < \infty$ in (3) is replaced by He's assumption $\sum_{n=0}^{\infty} \sigma_n^{2} < \infty$. However, in [15][16][17] this comes at the cost of adding an additional projection or "extragradient" step to the algorithm, and the applicable portion of [14] is efficient only for convex minimization.…”
Section: (X) -
mentioning
confidence: 99%
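
The "additional projection or extragradient step" mentioned here is, in the hybrid schemes cited as [15]–[17], roughly a separating-hyperplane projection built from the inexact proximal output; schematically (my paraphrase, with $y_n$ the inexact proximal point and $v_n \in T(y_n)$):

```latex
\[
x_{n+1} \;=\; x_n \;-\; \frac{\langle v_n,\, x_n - y_n\rangle}{\|v_n\|^{2}}\, v_n ,
\]
% i.e. the projection of x_n onto the hyperplane \{x : \langle v_n, x - y_n \rangle = 0\},
% which separates x_n from the zero set of T by monotonicity of T.
```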
“…In contrast to the regularization method of GPPM proposed in [57], which, in Hilbert spaces, produces strongly convergent sequences whose limits are the projections of their initial points onto the set of optima of ( ), the sequences resulting from the regularized version of GPPM proposed here converge strongly to the minimal-norm solution of ( ).…”
Section: Convergence and Stability of a Regularization Method for Max…
mentioning
confidence: 94%
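
The minimal-norm limit described here is characteristic of Tikhonov-type regularization. As a standard illustration (not the specific GPPM regularization of the citing paper): if $T$ is maximal monotone on a Hilbert space with $T^{-1}(0) \neq \emptyset$ and $x_\mu$ solves the regularized inclusion below, then $x_\mu$ converges strongly, as $\mu \downarrow 0$, to the element of $T^{-1}(0)$ of minimal norm:

```latex
\[
0 \;\in\; T(x_\mu) + \mu\, x_\mu , \qquad \mu > 0,
\qquad\quad
x_\mu \;\longrightarrow\; \operatorname*{argmin}\{\,\|x\| : x \in T^{-1}(0)\,\}
\quad\text{as } \mu \downarrow 0 .
\]
```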
“…In infinite-dimensional Hilbert spaces, repeated attempts were recently made to discover what conditions the problem data should satisfy in order to ensure that the generalized proximal point method converges weakly or strongly under error perturbations (see [8], [9], [15], [30], [57]). The regularized generalized proximal point method we propose in Section 4 works in non-Hilbertian spaces too.…”
mentioning
confidence: 99%