2015
DOI: 10.12988/ams.2015.411995

A conjugate gradient method with inexact line search for unconstrained optimization

Abstract: In this paper, an efficient nonlinear modified PRP conjugate gradient method is presented for solving large-scale unconstrained optimization problems. The sufficient descent property is satisfied under the strong Wolfe-Powell (SWP) line search by restricting the line search parameter to be less than 1/4. The global convergence result is established under the SWP line search conditions. Numerical results, for a set of 133 unconstrained optimization test problems, show that this method is better than the PRP method and the F…
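As a rough illustration of the class of algorithm the abstract describes, the sketch below implements a generic PRP-type nonlinear conjugate gradient loop with a strong Wolfe line search via SciPy. It is a minimal sketch under stated assumptions, not the paper's method: the paper's specific modification of the PRP coefficient and its below-1/4 parameter restriction are not reproduced, and the function name prp_cg and the PRP+ (nonnegative) truncation are illustrative choices only.

```python
# Minimal sketch, assuming a generic PRP-type conjugate gradient method with a
# strong Wolfe line search. This is NOT the paper's modified PRP method: its
# specific coefficient and the < 1/4 parameter restriction are not reproduced.
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    """Illustrative PRP+ conjugate gradient loop (hypothetical helper)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # start along steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step size satisfying the strong Wolfe conditions (c2 < 1/2 is common for CG).
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                     # line search failed
            if np.allclose(d, -g):
                break                         # already steepest descent: give up
            d = -g                            # otherwise restart and retry
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Classical PRP coefficient, truncated at zero (PRP+) as a safeguard.
        beta = max(0.0, float(g_new @ (g_new - g)) / float(g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(prp_cg(rosen, rosen_der, np.array([-1.2, 1.0])))  # approx. [1., 1.]
```

The PRP+ truncation (clipping beta at zero) is one standard safeguard for retaining descent-type behavior with inexact line searches; the paper instead attains sufficient descent through its restriction on the line search parameter.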

Cited by 5 publications (3 citation statements) · References 19 publications
“…Two novel CGMs were proposed based on the new conjugacy requirement, one of which was found to be particularly efficient. A modified PRP CG approach was introduced by Mohammed et al. (2015) to solve large-scale unconstrained optimization problems, where the parameter is restricted and the sufficient descent property is satisfied under the strong Wolfe-Powell line search. Zhang (2009) introduced two new variants of the Hestenes-Stiefel nonlinear CGM, based on the secant condition commonly met by quasi-Newton methods, which are descent methods even with inexact line searches.…”
Section: Review of Related Literature · Citation type: mentioning
Confidence: 99%
“…In this study, a new CG method with a coefficient named β_k^UAM (Ummie, Asrul and Mustafa) is proposed, pursuing the excellent performance of the coefficients proposed by [22] and [7] in successfully solving the test problems.…”
Section: A Proposed New Coefficient · Citation type: mentioning
Confidence: 99%
“…During the last decade, much effort has been devoted to developing new modifications of conjugate gradient methods which not only possess strong convergence properties but are also computationally superior to the classical methods. Such methods can be found in [18][19][20][21][22][23][24][25][26][27][28][29][30].…”
Section: New Formula for β_k and Its Properties · Citation type: mentioning
Confidence: 99%