2010
DOI: 10.1007/s10957-010-9667-4
Hybrid Approximate Proximal Method with Auxiliary Variational Inequality for Vector Optimization

Abstract: This paper studies the general vector optimization problem of finding weakly efficient points for mappings in a Banach space Y, with respect to the partial order induced by a closed, convex, and pointed cone C ⊂ Y with nonempty interior. In order to find a solution of this problem, we introduce an auxiliary variational inequality problem for a monotone, Lipschitz-continuous mapping. The approximate proximal method in vector optimization is extended to develop a hybrid approximate proximal method for th…
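To make the solution concept in the abstract explicit, here is a short restatement in standard notation; the feasible set Ω and the objective map F : X → Y are generic symbols introduced for illustration, since the truncated abstract does not name them:

x^{*} \in \Omega \ \text{is weakly efficient for } F \iff \nexists\, x \in \Omega \ \text{such that} \ F(x^{*}) - F(x) \in \operatorname{int} C .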

Cited by 63 publications (22 citation statements)
References 27 publications
“…The hybrid proximal extragradient (HPE) method of [3] has been used in the last few years as a framework for the analysis and development of several algorithms for solving monotone inclusion, saddle-point and convex optimization problems [3,4,5,6,7,8,9,10,11,12,13,14]. Next we present the HPE method.…”
Section: On the Hybrid Proximal Extragradient Method (mentioning)
confidence: 99%
“…The hybrid proximal extragradient (HPE) method of Solodov and Svaiter [3] is an inexact version of Rockafellar's PPM which uses a relative error tolerance criterion for solving each proximal subproblem instead of a summable error condition. The HPE method has been used by many authors [3,4,5,6,7,8,9,10,11,12,13,14] as a framework for the design and analysis of several algorithms for monotone inclusion problems, variational inequalities, saddle-point problems and convex optimization. Its iteration-complexity has been established recently by Monteiro and Svaiter [15] and, as a consequence, the iteration-complexity of various important algorithms in optimization (which use the HPE method as a framework) has been obtained, including Tseng's forward-backward method, Korpelevich's extragradient method and the alternating direction method of multipliers (ADMM) [12,15,16].…”
Section: Introduction (mentioning)
confidence: 99%
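As background for the citation statements above, the following is a minimal sketch, not taken from the cited papers, of Korpelevich's extragradient iteration for a monotone, Lipschitz-continuous map; the statements note that this classical scheme is covered by the HPE framework. The affine map, step size, and starting point below are illustrative assumptions.

import numpy as np

def extragradient(F, x0, step, tol=1e-8, max_iter=1000):
    # Korpelevich-style extragradient iteration for a monotone,
    # Lipschitz-continuous map F; a classical special case of the
    # HPE framework when step * L < 1 (L = Lipschitz constant of F).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = x - step * F(x)        # prediction step
        x_new = x - step * F(y)    # correction (extragradient) step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Illustrative monotone affine map F(x) = A x + b (hypothetical data);
# the iteration approximates a zero of F, i.e. an unconstrained VI solution.
A = np.array([[1.0, 2.0], [-2.0, 1.0]])   # identity plus a skew part, hence monotone
b = np.array([1.0, -1.0])
root = extragradient(lambda x: A @ x + b, x0=np.zeros(2), step=0.2)

The two evaluations of F per iteration (prediction, then correction) distinguish the extragradient step from a plain forward step and are the structural feature that the HPE relative-error criterion generalizes.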
“…Later, Ceng and Yao [6] generalized the results in [4] to approximate versions and discussed an extension to Bregman-function-based proximal algorithms for finding a weakly efficient solution of (1.3). Recently, the authors in [5] introduced and studied a hybrid approximate proximal method for finding weakly efficient solutions to the convex constrained vector optimization problem, utilizing and developing iterative processes from fixed-point theory for nonexpansive operators and combining them with algorithms for solving certain classes of monotone variational inequalities. Other results in this direction are given in [9], where the authors developed new hybrid approximate proximal-type algorithms to find efficient (or Pareto) solutions to convex constrained vector optimization problems in both finite-dimensional and infinite-dimensional Hilbert spaces.…”
Section: T. D. Chuong (mentioning)
confidence: 99%
“…These types of problems have applications in economics, industry, agriculture, and other fields; see [13]. The authors of [6] considered extensions of the proximal point method to the multiobjective setting; see also [1,2,3,4,7,8,9,20] and references therein.…”
Section: Introduction (mentioning)
confidence: 99%
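For orientation only, here is a minimal sketch of the classical scalar proximal point iteration that the multiobjective extensions cited above build on; the quadratic test objective, the inner gradient-descent solver, and all parameter values are assumptions for illustration, not the method of [6].

import numpy as np

def proximal_point(grad_f, x0, lam=1.0, max_iter=50, inner_steps=200, inner_lr=0.05):
    # Classical proximal point iteration:
    #   x_{k+1} = argmin_x  f(x) + (1 / (2 * lam)) * ||x - x_k||^2,
    # with each regularized subproblem solved approximately by gradient descent.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        z = x.copy()
        for _ in range(inner_steps):
            g = grad_f(z) + (z - x) / lam   # gradient of the subproblem at z
            z = z - inner_lr * g
        x = z                               # accept the (approximate) proximal step
    return x

# Example with the convex test objective f(x) = 0.5 * ||x||^2 (hypothetical data).
x_star = proximal_point(lambda x: x, x0=np.array([3.0, -2.0]))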