2016
DOI: 10.1080/10556788.2016.1225213
A new spectral conjugate gradient method for large-scale unconstrained optimization

Cited by 45 publications (15 citation statements)
References 24 publications
“…Obviously, the search direction d_k generated by Algorithm 2 satisfies the sufficient descent condition (13). Therefore, if the stepsize α_k is calculated by the Wolfe–Powell line searches (1) and (2), then the Zoutendijk condition (16) also holds for Algorithm 2.…”
Section: An Improved Version Of LSTT (LSTT+)
confidence: 99%
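The statement above combines two standard ingredients: a sufficient descent condition on the search direction and the Wolfe–Powell conditions on the stepsize. A minimal sketch of how both checks look in code (the function names, test point, and constants c, c1, c2 are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def satisfies_sufficient_descent(g, d, c=1e-4):
    # Sufficient descent condition: g_k^T d_k <= -c * ||g_k||^2
    return float(g @ d) <= -c * float(g @ g)

def wolfe_powell_ok(f, grad, x, d, alpha, c1=1e-4, c2=0.9):
    # Wolfe-Powell conditions: Armijo (sufficient decrease) + curvature
    g0 = grad(x)
    armijo = f(x + alpha * d) <= f(x) + c1 * alpha * float(g0 @ d)
    curvature = float(grad(x + alpha * d) @ d) >= c2 * float(g0 @ d)
    return armijo and curvature

# Toy check on the quadratic f(x) = 0.5 * ||x||^2, using steepest descent
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x = np.array([1.0, -2.0])
d = -grad(x)
print(satisfies_sufficient_descent(grad(x), d))   # True
print(wolfe_powell_ok(f, grad, x, d, alpha=0.5))  # True
```

Any direction passing the descent check combined with a Wolfe–Powell stepsize is exactly the setting in which Zoutendijk-type conditions are usually derived.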
“…In recent years, based on the above classical formulas and line searches, many variations of CG methods have been proposed, including spectral CG methods [12,13], hybrid CG methods [14,15], and three-term CG methods [16,17]. Among them, the three-term CG methods seem to attract the most attention, and a great deal of effort has been devoted to developing this class of methods; see, e.g., [18][19][20][21][22][23].…”
Section: Introduction
confidence: 99%
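The three-term CG methods mentioned above extend the classical direction d_{k+1} = -g_{k+1} + β_k d_k with an additional correction term. A minimal sketch of one well-known variant (a Hestenes–Stiefel-based three-term direction; the exact update formulas in the cited papers differ, and the fallback tolerance is an assumption):

```python
import numpy as np

def three_term_cg_direction(g_new, g_old, d_old, eps=1e-12):
    # Illustrative three-term CG direction:
    #   d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,  y_k = g_{k+1} - g_k
    # with Hestenes-Stiefel beta_k = g_{k+1}^T y_k / d_k^T y_k and
    # theta_k = g_{k+1}^T d_k / d_k^T y_k.
    y = g_new - g_old
    denom = float(d_old @ y)
    if abs(denom) < eps:
        return -g_new                      # fall back to steepest descent
    beta = float(g_new @ y) / denom
    theta = float(g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y
```

A short calculation shows this particular choice yields g_{k+1}^T d_{k+1} = -‖g_{k+1}‖² identically, i.e. sufficient descent holds independently of the line search, which is one reason three-term variants attract attention.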
“…Case (ii): For k ∈ I_2, from (21) we have g_k^T d_k = −‖g_k‖². Substituting this equation and (5) into (25) implies that…”
Section: (H2)
confidence: 99%
“…Due to the strong convergence of the Newton method, Andrei [1] proposed an accelerated conjugate gradient method, which exploits the Newton method to improve the performance of the conjugate gradient method. Following this idea, Parvaneh et al. [24] proposed a new SCG, which is a modified version of the method suggested by Jian et al. [15]. Masoud [21] introduced a scaled conjugate gradient method that inherits the good properties of the classical conjugate gradient method.…”
Section: Introduction
confidence: 99%