2012
DOI: 10.1016/j.amc.2011.12.091
Another improved Wei–Yao–Liu nonlinear conjugate gradient method with sufficient descent property

Cited by 55 publications (51 citation statements) · References 19 publications
“…In the numerical results we compare the three-term BZAU method with the TMPRP1 method [32]. In [32] the TMPRP1 method is shown to be numerically efficient in comparison with two other robust methods, the CG Descent method [41] and the DTPRP method [42]. That is the reason for comparing our BZAU method with the TMPRP1 method.…”
Section: Results — mentioning, confidence: 99%
“…In [23] the authors relied on WYL to suggest a new conjugate gradient method, NPRP, and proved that the NPRP method satisfies the descent condition and the global convergence property under the strong Wolfe line search. Subsequently, [24] introduced an improved NPRP method known as the DPRP method. In this section, enlightened by the previous ideas [22][23][24], we propose our method, known as .…”
Section: New Formula For and Its Properties — mentioning, confidence: 99%
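The WYL-family conjugate gradient iterations discussed in these excerpts can be sketched as follows. This is a minimal illustration, not the cited papers' exact methods: it assumes the standard Wei–Yao–Liu parameter β_k = g_k^T(g_k − (‖g_k‖/‖g_{k−1}‖) g_{k−1}) / ‖g_{k−1}‖² and applies it to a made-up 2-D quadratic with exact line search; the function and matrix here are illustrative only.

```python
import numpy as np

def beta_wyl(g_new, g_old):
    # Wei-Yao-Liu parameter:
    # g_k^T (g_k - (||g_k||/||g_{k-1}||) g_{k-1}) / ||g_{k-1}||^2
    ratio = np.linalg.norm(g_new) / np.linalg.norm(g_old)
    return g_new @ (g_new - ratio * g_old) / np.linalg.norm(g_old) ** 2

# Toy strictly convex quadratic f(x) = 0.5 x^T A x - b^T x, gradient g = A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])

x = np.zeros(2)
g = A @ x - b
d = -g                                  # first direction: steepest descent
for _ in range(50):
    if np.linalg.norm(g) < 1e-10:       # stop once the gradient vanishes
        break
    alpha = -(g @ d) / (d @ A @ d)      # exact line search on the quadratic
    x = x + alpha * d
    g_new = A @ x - b
    d = -g_new + beta_wyl(g_new, g) * d # conjugate gradient direction update
    g = g_new
```

Under exact line search g_{k}^T d_{k-1} = 0, so each new direction satisfies g_k^T d_k = −‖g_k‖² < 0, i.e. it is always a descent direction — the property the quoted passages refer to.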
“…Applying (27) to estimate the parameters of this system, the parameter estimates and their errors are shown in Table 1 and Fig. 3 with the data length L_e = 2,000, where the parameter estimation errors are defined by δ := ‖θ̂_k − θ‖ / ‖θ‖.…”
Section: Examples — mentioning, confidence: 99%
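The relative estimation error δ quoted above is a simple normalized distance between the estimated and true parameter vectors. A minimal sketch, using made-up illustrative values for θ and θ̂ (these are not data from the cited paper):

```python
import numpy as np

# Relative parameter-estimation error: delta = ||theta_hat_k - theta|| / ||theta||.
# theta_true and theta_hat are illustrative values, not results from the paper.
theta_true = np.array([0.5, -1.2, 0.8])
theta_hat = np.array([0.48, -1.15, 0.83])

delta = np.linalg.norm(theta_hat - theta_true) / np.linalg.norm(theta_true)
```

Normalizing by ‖θ‖ makes the error scale-free, so estimates of systems with very different parameter magnitudes can be compared on the same plot, as in the cited Fig. 3.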
“…The stochastic gradient (SG) algorithms include the conjugate gradient algorithms [2], the alternative gradient algorithms [41], the gradient projection algorithms [1], and the steepest descent algorithms [20]. Two typical algorithms are the multi-innovation SG algorithm and the gradient-based iterative algorithm [8,10].…”
Mentioning — confidence: 99%