2014
DOI: 10.1007/s10898-014-0150-x

A hybrid method without extrapolation step for solving variational inequality problems

Abstract: In this paper, we introduce a new method for solving variational inequality problems with monotone and Lipschitz-continuous mappings in Hilbert space. The iterative process is based on two well-known methods: the projection method and the hybrid (or outer approximation) method. However, we do not use an extrapolation step in the projection method. The absence of one projection in our method is explained by a slightly different choice of sets in the hybrid method. We prove strong convergence of the sequences generated by our method…

Cited by 117 publications (44 citation statements)
References 24 publications
“…However, in infinite dimensional Hilbert spaces, the extragradient method only converges weakly. In recent years, the extragradient method has received a lot of attention; see, for example, [10,14,15,24,30,31] and the references therein. Nadezhkina and Takahashi [32] introduced the following hybrid extragradient method:

    y_n     = P_K(x_n − λ A(x_n)),
    z_n     = P_K(x_n − λ A(y_n)),
    C_n     = {z ∈ C : ‖z − z_n‖ ≤ ‖z − x_n‖},
    Q_n     = {z ∈ C : ⟨x_0 − x_n, z − x_n⟩ ≤ 0},
    x_{n+1} = P_{C_n ∩ Q_n}(x_0),                      (4)

where λ ∈ (0, 1/L).…”
Section: Introduction
confidence: 99%
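The extragradient core of scheme (4) — the two projection steps defining y_n and z_n — can be sketched numerically. The following is a minimal illustration only, not the full hybrid method: the projection onto C_n ∩ Q_n is omitted, the feasible set is taken as a box so that P_K is a coordinate-wise clip, and A(x) = Mx + q with skew-symmetric M is an assumed toy operator (monotone and Lipschitz with constant L = ‖M‖).

```python
import numpy as np

# Toy monotone, Lipschitz operator: A(x) = M x + q with M skew-symmetric.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])   # ||M|| = 1, so L = 1
q = np.array([1.0, -1.0])                 # A vanishes at x* = (-1, -1)
A = lambda x: M @ x + q

# Feasible set K = [-2, 2]^2 (x* lies in its interior);
# the metric projection P_K is a coordinate-wise clip.
P_K = lambda x: np.clip(x, -2.0, 2.0)

lam = 0.5                    # step size, must lie in (0, 1/L)
x = np.array([2.0, 2.0])     # arbitrary starting point
for _ in range(300):
    y = P_K(x - lam * A(x))  # extrapolation (prediction) step
    x = P_K(x - lam * A(y))  # correction step
```

Here the iterates approach the solution x* = (−1, −1) of the variational inequality. In the full hybrid scheme (4), the correction step would instead produce z_n, which feeds the construction of the half-spaces C_n and Q_n, and the next iterate x_{n+1} = P_{C_n ∩ Q_n}(x_0) is what guarantees strong (rather than weak) convergence in infinite-dimensional spaces.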
“…Inspired by Malitsky and Semenov's results [17], we propose Algorithm 3.1, which extends Algorithm 1.2 from Hilbert spaces to Banach spaces, and prove a strong convergence theorem; this differs from the scheme proposed by Nakajo [2].…”
Section: Results
confidence: 99%
“…This might seriously affect the efficiency of Algorithm 1.1. In this paper, we construct a new iterative algorithm based on the idea in [17], as follows.…”
Section: Introduction
confidence: 99%
“…Convergence without modifying the problem is provided by iterative extragradient methods, first proposed by Korpelevich [21]. These methods have been analyzed in many studies [22–34]. For variational inequalities and equilibrium programming problems, modifications of the Korpelevich algorithm with a single metric projection onto the feasible set have been proposed [27,28].…”
Section: Introduction
confidence: 99%