2019
DOI: 10.1186/s13660-019-1958-1
Self-adaptive subgradient extragradient method with inertial modification for solving monotone variational inequality problems and quasi-nonexpansive fixed point problems

Abstract: In this paper, we introduce a new self-adaptive algorithm for finding a solution of the variational inequality problem involving a monotone operator and the fixed point problem of a quasi-nonexpansive mapping with a demiclosedness property in a real Hilbert space. The algorithm is based on the subgradient extragradient method and the inertial method. At the same time, it can be considered an improvement, at each computational step, of the previously known inertial extragradient method. The w…
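The abstract describes the method only in outline, so the following is a minimal Python sketch of the generic inertial subgradient extragradient template it refers to: an inertial extrapolation, one projection onto C, one closed-form projection onto a half-space, a self-adaptive step size that avoids the Lipschitz constant, and a convex combination with the quasi-nonexpansive mapping T. The function name, the default parameters, and the exact way T and the inertial weight enter the update are assumptions made for illustration; they may differ from the paper's actual algorithm. A, T and proj_C are user-supplied callables.

import numpy as np

def inertial_subgradient_extragradient(A, T, proj_C, x0, x1,
                                       lam0=0.6, mu=0.5, theta=0.3,
                                       alpha=lambda n: 1.0 / n,
                                       max_iter=1000, tol=1e-8):
    # x0, x1: two starting points (the inertial term needs a pair of iterates)
    x_prev, x = np.asarray(x0, dtype=float), np.asarray(x1, dtype=float)
    lam = lam0
    for n in range(1, max_iter + 1):
        # inertial extrapolation step
        w = x + theta * (x - x_prev)
        Aw = A(w)
        # first projection: onto the feasible set C
        y = proj_C(w - lam * Aw)
        Ay = A(y)
        # second projection: onto the half-space
        # T_n = {z : <w - lam*Aw - y, z - y> <= 0}, which contains C
        a = w - lam * Aw - y
        v = w - lam * Ay
        s = float(np.dot(a, v - y))
        z = v if s <= 0.0 else v - (s / float(np.dot(a, a))) * a
        # mix with the quasi-nonexpansive mapping T to target its fixed points
        an = alpha(n)
        x_next = (1.0 - an) * z + an * T(z)
        # self-adaptive step size: no Lipschitz constant of A is needed
        denom = np.linalg.norm(Aw - Ay)
        if denom > 0.0:
            lam = min(lam, mu * np.linalg.norm(w - y) / denom)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x_prev, x = x, x_next
    return x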

Cited by 18 publications (8 citation statements)
References 21 publications
“…It is known that the class of nonexpansive mappings possesses the demiclosed principle. However, the class of quasi-nonexpansive mappings may fail to possess the demiclosed principle; see, e.g., [23,27].…”
Section: Preliminaries
confidence: 99%
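For reference, the demiclosedness principle invoked here is the standard one: a mapping T is demiclosed at zero if, whenever a sequence {x_n} converges weakly to x and ‖x_n − T x_n‖ → 0, one has T x = x. Every nonexpansive mapping satisfies this (Browder's demiclosedness principle), whereas a quasi-nonexpansive mapping need not, which is why the paper imposes a demiclosedness property on the quasi-nonexpansive mapping explicitly.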
“…We test for λ_0 = 0.6 and λ_0 = 0.2. We also compare the computational efficiency of the proposed algorithm with Algorithm 3.1 [24] and ISA-SEGM [23]. Choose x(0) = 0, x(1) = 0, µ = 0.5 and α_n = 1/n.…”
Section: Computational Experiments
confidence: 99%
“…Choose x(0) = 0, x(1) = 0, µ = 0.5 and α_n = 1/n. We choose γ_n = 5 for the proposed algorithm and γ_n = 0.7 for ISA-SEGM [23]. In order to terminate the algorithm, we use the condition ‖x_{n+1} − x*‖ < ε, where x* is the solution of the problem.…”
Section: Computational Experiments
confidence: 99%
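To make the quoted parameter choices concrete, here is a hypothetical driver that wires them into the sketch given after the abstract (it assumes that function is in scope): starting points x(0) = x(1) = 0, µ = 0.5, α_n = 1/n, λ_0 = 0.6, and a final check of ‖x_final − x*‖ < ε against a known solution x*. The operator A, the set C = R², and the mapping T are illustrative stand-ins only, and the parameter γ_n of the compared algorithms is not modelled here.

import numpy as np

# Illustrative monotone operator A(x) = Mx + q with M symmetric positive definite.
M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -2.0])
A = lambda x: M @ x + q
proj_C = lambda x: x              # take C = R^2, so the projection is the identity
T = lambda x: x                   # the identity map is (quasi-)nonexpansive
x_star = np.linalg.solve(M, -q)   # A(x_star) = 0, so x_star solves VI(C, A)

eps = 1e-6
x_final = inertial_subgradient_extragradient(
    A, T, proj_C, x0=np.zeros(2), x1=np.zeros(2),
    lam0=0.6, mu=0.5, alpha=lambda n: 1.0 / n,
    max_iter=10000, tol=1e-10)
# termination test of the cited experiments, applied to the last iterate
print("within eps of x_star:", np.linalg.norm(x_final - x_star) < eps)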
“…If VI(C, A) ≠ ∅ and λ ∈ (0, 1/L), the sequence {x_n} generated by (1.3) converges weakly to an element of VI(C, A). Recently, this method has attracted the attention of many authors; see, e.g., [2,12,15,16,18] and the references therein. The second method for solving variational inequality problems, in which the second projection onto the feasible set C is replaced by a projection onto a specially constructed half-space, is the so-called subgradient extragradient method of Censor, Gibali and Reich [3].…”
Section: Introduction
confidence: 99%
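Since this passage contrasts the two methods only verbally, the following minimal Python sketch shows a single step of each under the stated assumptions (A monotone and L-Lipschitz, fixed λ ∈ (0, 1/L), proj_C the metric projection onto C); the function names are placeholders for illustration, not the cited authors' code.

import numpy as np

def extragradient_step(A, proj_C, x, lam):
    # Korpelevich's extragradient method (1.3): two projections onto C per step
    y = proj_C(x - lam * A(x))
    return proj_C(x - lam * A(y))

def subgradient_extragradient_step(A, proj_C, x, lam):
    # Censor-Gibali-Reich: the second projection onto C is replaced by a
    # projection onto the half-space H = {z : <x - lam*A(x) - y, z - y> <= 0},
    # which contains C and whose projection has a closed form
    y = proj_C(x - lam * A(x))
    a = x - lam * A(x) - y
    v = x - lam * A(y)
    s = float(np.dot(a, v - y))
    return v if s <= 0.0 else v - (s / float(np.dot(a, a))) * a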