1970
DOI: 10.1051/m2an/197004r301541
Brève communication. Régularisation d'inéquations variationnelles par approximations successives [Brief communication: Regularization of variational inequalities by successive approximations]

Cited by 589 publications (600 citation statements)
References 2 publications
“…Theorem 3.6 shows that the sequence {x_k} is bounded. The following result establishes boundedness of {y_k}, and hence of {x_k} in view of (6), under the condition that y_{k+1} is chosen to be ỹ_{k+1} at every iteration of the A-HPE framework.…”
Section: End
confidence: 95%
“…method, proposed by Martinet [6] and further studied by Rockafellar [16,15], is a classical iterative scheme for solving the MI problem which generates a sequence {x_k} according to x_k = (λ_k T + I)^{-1}(x_{k-1}). It has been used as a generic framework for the design and analysis of several implementable algorithms. The classical inexact version of the proximal point method allows for the presence of a sequence of summable errors in the above iteration, i.e.…”
confidence: 99%
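The iteration quoted above, x_k = (λ_k T + I)^{-1}(x_{k-1}), can be sketched concretely. The following is a minimal illustration, not code from any cited paper: the operator T(x) = x − 3 and all function names are hypothetical choices for which the resolvent (λT + I)^{-1} has a closed form.

```python
# Hypothetical illustration of the proximal point iteration
#   x_k = (lambda_k * T + I)^{-1}(x_{k-1})
# for the monotone operator T(x) = x - 3, whose zero is x* = 3.

def resolvent(y, lam, a=3.0):
    # Apply (lam*T + I)^{-1}: solve lam*(x - a) + x = y for x.
    return (y + lam * a) / (1.0 + lam)

def proximal_point(x0, lam=1.0, iters=50):
    # Run the proximal point iteration with a constant step lam.
    x = x0
    for _ in range(iters):
        x = resolvent(x, lam)  # x_k = (lam*T + I)^{-1}(x_{k-1})
    return x

print(proximal_point(10.0))  # approaches the zero of T, x* = 3
```

With λ = 1 each step halves the distance to the zero of T, so the iterates converge geometrically in this toy case.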
“…We see that each step of the iteration involves only A as the forward step and B as the backward step, but not the sum of A and B. This method includes, in particular, the proximal point algorithm [4,6,14,20,27] and the gradient method [3,13]. Lions-Mercier [16] introduced the following splitting iterative methods in a real Hilbert space:…”
Section: Introduction
confidence: 99%
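The forward-backward scheme described in this quote can be sketched as follows. This is a hypothetical one-dimensional example, not taken from the cited works: the smooth part f(x) = 0.5(x − 2)², handled by a forward (gradient) step, stands in for A, and g(x) = |x|, handled by a backward (proximal) step, stands in for B.

```python
# Hypothetical sketch of forward-backward splitting for min f(x) + g(x)
# with f(x) = 0.5*(x - 2)^2 (forward step on A = f') and g(x) = |x|
# (backward step on B = the subdifferential of g).

def soft_threshold(y, t):
    # Proximal map of t*|.|: shrink y toward 0 by t.
    if y > t:
        return y - t
    if y < -t:
        return y + t
    return 0.0

def forward_backward(x0, step=0.5, iters=100):
    x = x0
    for _ in range(iters):
        grad = x - 2.0                             # forward step on A
        x = soft_threshold(x - step * grad, step)  # backward step on B
    return x

print(forward_backward(5.0))  # minimizer of 0.5*(x-2)^2 + |x| is x* = 1
```

Note that each iteration evaluates the gradient of f and the proximal map of g separately; the sum f + g is never handled as a single operator, which is exactly the point of the splitting.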
“…4.1 A question of interest in convex optimization concerns the strong convergence of the generalized proximal point method (GPPM for short) which emerged from the works of Martinet [43], [44], Rockafellar [52], and Censor and Zenios [21]. When applied to the consistent problem ( ) described in Subsection 3.1, the GPPM produces iterates according to the rule …”
Section: Regularization of a Proximal Point Method
confidence: 99%
“…In Section 4 we consider the question of whether, or under which conditions, the generalized proximal point method for optimization which emerged from the works of Martinet [43], [44], Rockafellar [52], and Censor and Zenios [21] can be forced to converge strongly in infinite-dimensional Banach spaces. The origin of this question can be traced back to Rockafellar's work [52].…”
confidence: 99%