2010
DOI: 10.1007/s11590-010-0224-8
Global convergence of some modified PRP nonlinear conjugate gradient methods

Abstract: Recently, similar to Hager and Zhang (SIAM J Optim 16:170-192, 2005), Yu (Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems. Thesis of Doctors Degree, Sun Yat-Sen University, 2007) and Yuan (Optim Lett 3:11-21, 2009) proposed modified PRP conjugate gradient methods which generate sufficient descent directions without any line searches. In order to obtain the global convergence of their algorithms, they need the assumption that the stepsize is bounded away from zero. In thi…
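The sufficient descent property mentioned in the abstract can be illustrated with the well-known three-term modified PRP direction of Zhang, Zhou, and Li, which the cited methods build on. Below is a minimal sketch (function name and variable names are illustrative, not from the paper): the extra term is chosen so that the directional derivative g_k·d_k equals -||g_k||² identically, independent of any line search.

```python
import numpy as np

def mprp_direction(g_k, g_prev, d_prev):
    """Three-term modified PRP direction (Zhang-Zhou-Li style sketch).

    d_k = -g_k + beta_k * d_prev - theta_k * y_prev, with
        y_prev  = g_k - g_prev
        beta_k  = (g_k @ y_prev) / ||g_prev||^2   (classical PRP beta)
        theta_k = (g_k @ d_prev) / ||g_prev||^2

    Expanding g_k @ d_k, the beta and theta terms cancel exactly,
    leaving g_k @ d_k = -||g_k||^2: sufficient descent holds for
    every iterate, with no line-search assumption.
    """
    y_prev = g_k - g_prev
    denom = g_prev @ g_prev  # assumed nonzero (otherwise g_prev = 0 and we stop)
    beta = (g_k @ y_prev) / denom
    theta = (g_k @ d_prev) / denom
    return -g_k + beta * d_prev - theta * y_prev
```

The cancellation is algebraic, so the identity holds for arbitrary gradients and previous directions, which is what lets the cited papers drop line-search requirements from the descent argument.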

Cited by 26 publications (14 citation statements) · References 17 publications
“…To solve this problem, many algorithms that correct PRP have been proposed, such as DPRP (Polak and Ribiere, 1969), NMPRP (Dai et al, 2000), DMPRP (Yu, 2007), and PRP-DC (Yuan, 2009), and their convergence has been analyzed (Andrei, 2011; Zhang, Zhou, and Li, 2006). Recently, Zhang et al (Dai and Tian, 2011) also proposed a modified PRP method that satisfies the sufficient descent condition; numerical results show that this method is globally convergent on both uniformly convex functions and general nonlinear functions. Inspired by (Dai and Tian, 2011), this paper presents a modified PRP conjugate gradient (MPRPCG) algorithm, which uses a learning algorithm based on MPRPCG to modify the five sets of parameters in the QNN model: the quantum rotation gate phase θi, the hidden layer connection weight argument βij, the hidden layer activity value argument γij, the threshold γij, and the output layer connection weights wjk.…”
Section: Learning Algorithms and Global Convergence Analysis (mentioning)
confidence: 99%
“…Furthermore, if β_k in (6) is specified by an existing conjugate gradient formula, we obtain the corresponding modified conjugate gradient method [9,10,22,32-35]. Recently, researchers [7,22-25,30] have paid special attention to hybridizing the above two approaches.…”
Section: Introduction (mentioning)
confidence: 99%
“…Zhang et al [11] proposed a modified Polak-Ribiere-Polyak (PRP) conjugate gradient method with an Armijo-type line search, and obtained global convergence together with a sufficient descent direction. Dai and Tian [12] made a slight modification to Zhang's method so that the modified method retains the sufficient descent property. Without requiring a positive lower bound on the step size, they proved that the proposed methods are globally convergent.…”
Section: Introduction (mentioning)
confidence: 99%
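The Armijo-type line search referenced in the last statement can be sketched generically as backtracking until a sufficient-decrease condition holds. This is a standard Armijo rule, not the exact condition of [11] or [12] (those papers use variants); all parameter names below are illustrative defaults.

```python
import numpy as np

def armijo_backtracking(f, x, d, g, rho=0.5, sigma=1e-4, alpha0=1.0, max_iter=50):
    """Generic Armijo backtracking sketch.

    Shrinks alpha geometrically (factor rho) until
        f(x + alpha * d) <= f(x) + sigma * alpha * (g @ d),
    where d is a descent direction at x, i.e. g @ d < 0.
    Returns the accepted step size alpha.
    """
    alpha = alpha0
    fx = f(x)
    gd = g @ d  # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + sigma * alpha * gd:
            return alpha
        alpha *= rho
    return alpha  # fallback: smallest alpha tried
```

With a direction satisfying sufficient descent (g·d = -||g||², as in the modified PRP methods above), such a rule always terminates with a step that strictly decreases f, which is the building block of the convergence proofs discussed in these excerpts.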