2005
DOI: 10.1016/j.cam.2004.06.018

A steepest descent method for vector optimization

Cited by 181 publications (121 citation statements)
References 14 publications
“…Because of these applications, a large body of literature has been published studying optimality conditions, duality theories, and topological properties of solutions of multiobjective optimization problems (see, e.g., [9,18,20,25] and the references therein). Recently, several numerical methods for solving convex multiobjective optimization problems have been proposed: the steepest descent method for multiobjective optimization was treated in [14], and extensions of the projected gradient method to convex constrained vector optimization can be found in [16,17]. Bonnel et al. [5] constructed a vector-valued proximal point algorithm to investigate convex vector optimization problems in Hilbert space, generalizing Rockafellar's famous results [23] from the scalar case to the vector case.…”
Section: Introduction (mentioning)
Confidence: 98%
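Since the quote points to the steepest descent method for multiobjective optimization [14], a minimal sketch may help. The Python fragment below computes the bi-objective steepest descent direction as the negative minimum-norm convex combination of the objective gradients (the dual form of the direction-finding subproblem); the objectives f1 and f2, the starting point, and the fixed step size are illustrative assumptions, not details taken from the cited works.

```python
import numpy as np

def grad_f1(x):
    # Gradient of the placeholder objective f1(x) = x_1^2 + x_2^2.
    return 2.0 * x

def grad_f2(x):
    # Gradient of the placeholder objective f2(x) = (x_1 - 1)^2 + x_2^2.
    return 2.0 * (x - np.array([1.0, 0.0]))

def steepest_descent_direction(x):
    # Bi-objective steepest descent direction: d = -(lam*g1 + (1-lam)*g2),
    # where lam in [0, 1] minimizes ||lam*g1 + (1-lam)*g2||^2, i.e. the
    # minimum-norm convex combination of the gradients.
    # d == 0 exactly at Pareto-critical points.
    g1, g2 = grad_f1(x), grad_f2(x)
    diff = g1 - g2
    denom = diff @ diff
    lam = 0.5 if denom == 0.0 else float(np.clip(-(g2 @ diff) / denom, 0.0, 1.0))
    return -(lam * g1 + (1.0 - lam) * g2)

# One step with a fixed step size; the cited method would use an
# Armijo-type line search instead.
x = np.array([2.0, 1.0])
d = steepest_descent_direction(x)
x_next = x + 0.5 * d
```

For two objectives the inner minimization over the simplex reduces to a scalar quadratic in lam, which is why a closed form suffices here; with more objectives a small quadratic program would be solved instead.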
“…Moreover, in the last 15 years many algorithms have appeared that aim to solve new optimization-like problems (equilibrium, multiobjective, bilevel, order-value, and many others) [4,5,10,11,15,19,22]. Pointwise necessary optimality conditions have been found for many of these problems but, frequently, their algorithmic consequences are not clear.…”
Section: Final Remarks (mentioning)
Confidence: 99%
“…Assume that x^k → x^*. If x^k_1 > 0, the set Ω_k defined by (18)–(20) is the intersection of the half-space x_1 ≥ 0 with the tangent line to h(…”
Section: Approximate Gradient Projection Conditions (mentioning)
Confidence: 99%
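The quoted fragment is truncated, so the geometric setup can only be illustrated, not reproduced. Below is a small Python sketch, under stated assumptions, of finding a point in the intersection of the half-space x_1 ≥ 0 and a line {x : a·x = b} by alternating projections; the coefficients a and b are hypothetical stand-ins for the unavailable tangent-line data, and this is not the cited paper's construction of Ω_k.

```python
import numpy as np

def proj_halfspace(x):
    # Euclidean projection onto the half-space {x : x_1 >= 0}.
    y = x.copy()
    y[0] = max(y[0], 0.0)
    return y

def proj_line(x, a, b):
    # Euclidean projection onto the line {x : a @ x = b}, with a != 0.
    return x - ((a @ x - b) / (a @ a)) * a

def point_in_intersection(x0, a, b, iters=200):
    # Alternating projections (POCS): for two closed convex sets with
    # nonempty intersection, the iterates converge to a point in the
    # intersection (not necessarily the nearest one; Dykstra's algorithm
    # would give the exact projection).
    x = x0.astype(float).copy()
    for _ in range(iters):
        x = proj_line(proj_halfspace(x), a, b)
    return x

# Hypothetical coefficients standing in for the tangent-line data.
a = np.array([1.0, 2.0])
b = 1.0
x = point_in_intersection(np.array([-1.0, 3.0]), a, b)
```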