2015
DOI: 10.1016/j.apm.2014.08.008

A hybrid conjugate gradient method with descent property for unconstrained optimization

Abstract: In this paper, based on some well-known previous conjugate gradient methods, a new hybrid conjugate gradient method is presented for unconstrained optimization. The proposed method generates descent directions at every iteration; moreover, this property is independent of the line search used to compute the steplength. Under the Wolfe line search, the proposed method possesses global convergence. Medium-scale numerical experiments and their performance profiles are reported, which show that the proposed method is pro…
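The abstract describes a nonlinear conjugate gradient iteration whose directions are descent directions regardless of the steplength, with global convergence under the Wolfe line search. The paper's actual hybrid formula for the conjugate parameter is not shown on this page, so the sketch below only illustrates the generic shape of such an iteration in Python; the clipped-PRP choice of beta, the helper name cg_descent_sketch, and the test function are illustrative assumptions, not the method of the paper.

```python
# A minimal, generic sketch of a nonlinear conjugate gradient iteration with a
# Wolfe line search. NOT the hybrid method of the paper: the clipped-PRP beta
# below is a stand-in assumption used only to show the overall structure.
import numpy as np
from scipy.optimize import line_search


def cg_descent_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # scipy's line_search enforces the (strong) Wolfe conditions
        alpha, *_ = line_search(f, grad, x, d)
        if alpha is None:                     # line search failed: restart with -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Illustrative conjugate parameter: beta = max{0, PRP}; the cited
        # paper's hybrid formula differs and is not reproduced here.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Smooth test problem (an assumption; any smooth function would do)
    f = lambda x: 0.5 * (x @ x) + np.sin(x).sum()
    grad = lambda x: x + np.cos(x)
    print(cg_descent_sketch(f, grad, np.ones(5)))
```

Clipping beta at zero makes the direction fall back to steepest descent whenever the PRP value turns negative, which is one simple way to keep such an iteration well behaved; the cited paper obtains its descent property through its own specific hybrid construction.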

Cited by 56 publications (21 citation statements). References 27 publications.
“…Since Problem (34) is a standard smooth optimization problem, many powerful optimization techniques can be directly applied to solve it (see [37][38][39][40]).…”
Section: Definition (mentioning)
confidence: 99%
“…Quite recently, based on the ideas of hybrid methods, Jian et al [16] introduced a new hybrid choice for parameter β k as follows:…”
Section: Preliminaries (mentioning)
confidence: 99%
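The quoted statement elides the actual formula from [16], and it is not reproduced here. Purely as a hedged illustration of what a hybrid choice of the conjugate parameter typically looks like (a Hu-Storey-type truncation, not the formula of the cited paper), one classical value is clipped by another:

\[
\beta_k^{\mathrm{hyb}} \;=\; \max\Bigl\{0,\ \min\bigl\{\beta_k^{PRP},\ \beta_k^{FR}\bigr\}\Bigr\},
\]

where \(\beta_k^{PRP}\) and \(\beta_k^{FR}\) are the classical Polak-Ribiere-Polyak and Fletcher-Reeves parameters, written out after the next citation statement.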
“…It is well known that the FR method and the DY method usually possess nice convergence properties but have poor numerical performance. By contrast, the PRP method and the HS method are generally regarded as two of the most efficient conjugate gradient methods, but their convergence properties are not as good [16]. To the best of our knowledge, many conjugate gradient type methods for (2) need the restart technique [7,17,18].…”
Section: Introduction (mentioning)
confidence: 98%
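For reference, the four classical conjugate parameters named in the statement above have the standard definitions below, with \(g_k\) the gradient at the current iterate, \(d_{k-1}\) the previous search direction, and \(y_{k-1} = g_k - g_{k-1}\); this is textbook background recalled here for convenience, not material quoted from the cited paper.

\[
\beta_k^{FR} = \frac{\lVert g_k\rVert^{2}}{\lVert g_{k-1}\rVert^{2}}, \qquad
\beta_k^{PRP} = \frac{g_k^{\top} y_{k-1}}{\lVert g_{k-1}\rVert^{2}}, \qquad
\beta_k^{HS} = \frac{g_k^{\top} y_{k-1}}{d_{k-1}^{\top} y_{k-1}}, \qquad
\beta_k^{DY} = \frac{\lVert g_k\rVert^{2}}{d_{k-1}^{\top} y_{k-1}}.
\]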
“…Thus, to overcome these shortcomings of the classical CGMs, many researchers have paid great attention to improving the CGMs. As a result, many improved CGMs with excellent theoretical properties and numerical performance were proposed, for example, References [7][8][9][10][11][12][13][14][15][16][17][18][19][20]. Among these, the spectral conjugate gradient method (SCGM) proposed by Birgin and Martinez [12] can be seen as an important development of the CGM.…”
Section: Introduction (mentioning)
confidence: 99%
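As background for the spectral conjugate gradient method (SCGM) of Birgin and Martinez mentioned above, directions in that family take the form below, where \(\theta_k\) is a spectral (Barzilai-Borwein-type) scalar, \(s_{k-1} = x_k - x_{k-1}\), and \(y_{k-1} = g_k - g_{k-1}\); the exact \(\beta_k\) used in [12] is not reproduced here.

\[
d_k = -\theta_k\, g_k + \beta_k\, d_{k-1}, \qquad
\theta_k = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}}.
\]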
“…and fully absorbing the hybrid idea of Reference [11], we propose a new conjugate parameter in the following manner…”
Section: Introduction (mentioning)
confidence: 99%