2020
DOI: 10.2298/fil2007367a

Proximal point algorithm for differentiable quasi-convex multiobjective optimization

Abstract: This paper studies the proximal point method for solving multiobjective optimization problems whose objective functions are differentiable, locally Lipschitz, and quasi-convex. Control conditions are provided that guarantee that the accumulation points of any generated sequence are Pareto critical points.
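As a rough illustration of the kind of iteration the abstract refers to, the sketch below implements a generic proximal point step for a multiobjective problem: each iterate minimizes a max-type scalarization of the objective decreases plus a quadratic proximal term. This is a minimal sketch under assumed details, not the scheme analyzed in the paper; the regularization parameter alpha, the Nelder-Mead subproblem solver, and the two test objectives are all illustrative choices.

```python
import numpy as np
from scipy.optimize import minimize

def proximal_point_multiobjective(objectives, x0, alpha=1.0, max_iter=50, tol=1e-6):
    """Generic proximal point iteration for a multiobjective problem (sketch).

    Each step solves the scalarized proximal subproblem
        x_{k+1} in argmin_y  max_i [f_i(y) - f_i(x_k)] + (alpha/2) * ||y - x_k||^2,
    a standard surrogate whose fixed points are related to Pareto critical points.
    This is an illustrative implementation, not the authors' exact method.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = np.array([f(x) for f in objectives])  # objective values at the current iterate

        def subproblem(y):
            # Worst-case objective increase relative to x, plus the proximal penalty.
            gains = np.array([f(y) for f in objectives]) - fx
            return gains.max() + 0.5 * alpha * np.dot(y - x, y - x)

        res = minimize(subproblem, x, method="Nelder-Mead")
        if np.linalg.norm(res.x - x) < tol:  # prox step made no progress: stop
            return res.x
        x = res.x
    return x

# Hypothetical test: two smooth quasi-convex objectives on R^2.
f1 = lambda x: np.sqrt(1.0 + (x[0] - 1.0) ** 2)
f2 = lambda x: np.sqrt(1.0 + (x[1] + 1.0) ** 2)
print(proximal_point_multiobjective([f1, f2], x0=[3.0, 3.0]))
```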

Cited by 1 publication (1 citation statement) · References 14 publications
“…where F ∈ R^{n×n}, G ∈ R^{m×m} are symmetric positive semidefinite matrices and f ∈ R^n, g ∈ R^m are known vectors. This class of convex minimization problems arises in many areas of computational science and engineering, such as compressed sensing [1], finance [2,3], image restoration [4-6], network optimization problems [7,8], and traffic planning convex problems [9-12]. The model (1)-(2) captures many applications in different areas; see the l1-norm regularized least-squares problems in [12,13], the total variation image restoration in [13-16], and the standard quadratic programming problems in [7,13].…”
Section: Introduction
Citation type: Mentioning
Confidence: 99%