Wiley Encyclopedia of Operations Research and Management Science 2011
DOI: 10.1002/9780470400531.eorms0183

Nonlinear Conjugate Gradient Methods

Abstract: Conjugate gradient methods are an important class of methods for solving systems of linear equations and nonlinear optimization problems. This article reviews conjugate gradient methods for unconstrained optimization, dividing them into early conjugate gradient methods, descent conjugate gradient methods, and sufficient descent conjugate gradient methods. Two general convergence theorems are provided for the conjugate gradient method assuming the descent property of each search direction. Some resea…
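The general scheme the abstract refers to can be illustrated with a minimal sketch (an illustrative sketch, not the article's own code; the Armijo backtracking and the PRP/FR switch below are assumptions standing in for the Wolfe-type line searches and β_k choices analyzed in the article):

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="PRP", tol=1e-6, max_iter=1000):
    """Minimal nonlinear conjugate gradient sketch (illustrative only).

    `beta_rule` selects the classical PRP or FR choice of beta_k; the
    simple Armijo backtracking stands in for the Wolfe-type line
    searches analyzed in the article.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                  # safeguard: restart if d is not a descent direction
            d = -g
        alpha = 1.0                     # Armijo backtracking line search
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "PRP":          # Polak-Ribiere-Polyak
            beta = g_new @ (g_new - g) / (g @ g)
        else:                           # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d           # conjugate direction update
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))
```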

Cited by 97 publications (82 citation statements) · References 73 publications
“…This feature means that the PRP method has been one of the most efficient conjugate gradient methods in practical computation for many years. However, Dai [4] constructed an example showing that the PRP method may generate an ascent direction, causing the iterative scheme to fail, even if the objective function is uniformly convex and the strong Wolfe line search is used. So far, the convergence of the PRP method has not been completely proved under Wolfe-type line searches.…”
Section: Introduction
Mentioning, confidence: 99%
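For reference, the PRP direction update discussed in this statement is the standard one (restated here, not quoted from the citing paper), with g_k = ∇f(x_k):

```latex
d_0 = -g_0, \qquad
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{PRP}} d_k, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top}(g_{k+1} - g_k)}{\|g_k\|^{2}}.
```

Nothing in this formula forces g_{k+1}^⊤ d_{k+1} < 0, which is why Dai's example can produce an ascent direction even under the strong Wolfe line search.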
“…Dai and Yuan [7] proved that the DY method is a descent method and is globally convergent under the standard Wolfe line search. However, the HS method and the PRP method may generate ascent directions even with the strong Wolfe line search [4], which prevents their global convergence, although both methods are regarded as two of the most efficient conjugate gradient methods in practical computation. To guarantee global convergence of the PRP method, line searches that force it to generate descent directions have been proposed [4,14].…”
Section: Introduction
Mentioning, confidence: 99%
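For convenience, the line search conditions and the DY formula referenced above are the standard definitions (restated here, with 0 < c_1 < c_2 < 1 and y_k = g_{k+1} - g_k):

```latex
\text{standard Wolfe:}\quad
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\top} d_k,
\qquad
g(x_k + \alpha_k d_k)^{\top} d_k \ge c_2\, g_k^{\top} d_k;

\text{strong Wolfe:}\quad
\text{the same first condition, with}\quad
\bigl|g(x_k + \alpha_k d_k)^{\top} d_k\bigr| \le -c_2\, g_k^{\top} d_k;

\text{DY:}\quad
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k}.
```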
“…However, the HS method and the PRP method may generate ascent directions even with the strong Wolfe line search [4], which prevents their global convergence, although both methods are regarded as two of the most efficient conjugate gradient methods in practical computation. To guarantee global convergence of the PRP method, line searches that force it to generate descent directions have been proposed [4,14]. Recently, using an approximate descent backtracking line search, Zhou [25] showed that the original PRP method converges globally even for nonconvex functions, whether or not the search direction is a descent direction.…”
Section: Introduction
Mentioning, confidence: 99%
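The descent and sufficient descent properties that such line searches are designed to enforce (and that name two of the classes in the article's taxonomy) are, in standard notation:

```latex
g_k^{\top} d_k < 0 \quad \text{(descent)}, \qquad
g_k^{\top} d_k \le -c\,\|g_k\|^{2} \ \text{for some constant } c > 0 \quad \text{(sufficient descent)}.
```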
“…If f is a strictly convex quadratic function and the line search is exact, all these methods are equivalent; for a general function, however, different choices of β_k give rise to distinct conjugate gradient methods with quite different computational efficiency and convergence properties. We refer to the books [7,8], the survey paper [9], and the references therein for the numerical performance and convergence properties of conjugate gradient methods. During the last decade, much effort has been devoted to developing new conjugate gradient methods that are not only globally convergent for general functions but also computationally superior to the classical methods; these are classified into two classes.…”
Section: Introduction
Mentioning, confidence: 99%
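To make the role of β_k concrete, the four classical choices behind the method names used in these statements are (standard formulas, with y_k = g_{k+1} - g_k):

```latex
\beta_k^{\mathrm{FR}} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}}, \qquad
\beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^{2}}, \qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{\mathrm{DY}} = \frac{\|g_{k+1}\|^{2}}{d_k^{\top} y_k}.
```

For a strictly convex quadratic minimized with exact line search, g_{k+1}^⊤ g_k = 0 and d_k^⊤ y_k = ‖g_k‖², so all four formulas coincide, which is the equivalence noted above.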