2015
DOI: 10.1155/2015/103517
An Efficient Hybrid Conjugate Gradient Method with the Strong Wolfe-Powell Line Search

Abstract: The conjugate gradient (CG) method is a useful tool for solving optimization problems in many fields, such as design, economics, physics, and engineering. In this paper, we present a new hybrid CG method related to the well-known Polak-Ribière-Polyak (PRP) formula. It offers a remedy for the PRP method, which is not globally convergent under the strong Wolfe-Powell (SWP) line search. The new formula satisfies the sufficient descent condition and possesses global convergence properties. In addition, we further expla…
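To make the setting concrete, here is a minimal Python sketch of a generic PRP-type CG iteration driven by a strong Wolfe line search (SciPy's line_search enforces the strong Wolfe conditions). The max(β, 0) truncation and the steepest-descent restart are common safeguards assumed for illustration; they are not the paper's hybrid formula, which is not reproduced here.

```python
import numpy as np
from scipy.optimize import line_search

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic PRP conjugate gradient iteration with a strong Wolfe line search.

    Illustrative sketch only: the paper's hybrid modifies the beta formula,
    which is not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search returns a step satisfying the strong Wolfe
        # conditions with parameters c1 (delta) and c2 (sigma).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            # Line search failed: restart along steepest descent.
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter; max(., 0) is the common PRP+ safeguard.
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function with SciPy's helpers.
from scipy.optimize import rosen, rosen_der
x_star = prp_cg(rosen, rosen_der, np.zeros(4))  # converges toward [1, 1, 1, 1]
```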

Cited by 19 publications (20 citation statements) | References 19 publications
“…The parameters of the regridding-based reconstruction method included the Kaiser-Bessel convolution kernel [ 13 ] with a window width of 2, β = 18.5547, and an oversampling ratio of 2. For the self-adapting CS reconstruction method, we set λ = 0.05, the adaption coefficient to 0.6, the initial search step size to 1, and the maximum number of iterations to 100, and used a Wolfe line search [ 14 ].…”
Section: Simulation Experiments
confidence: 99%
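For illustration, the settings quoted above could be collected into parameter dictionaries like the following; the key names are hypothetical and do not come from the cited papers' code.

```python
# Hypothetical parameter dictionaries mirroring the quoted settings;
# names are illustrative, not taken from the cited papers' code.
regridding_params = {
    "kernel": "kaiser-bessel",
    "window_width": 2,
    "beta": 18.5547,
    "oversampling_ratio": 2,
}
cs_params = {
    "lambda": 0.05,           # regularization weight
    "adaption_coeff": 0.6,
    "initial_step": 1.0,
    "max_iter": 100,
    "line_search": "wolfe",
}
```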
“…Therefore, the PRP method is the most efficient when compared with the other conjugate gradient methods. For more details, the reader can consult references [14][15][16][17][18][19].…”
Section: Abstract and Applied Analysis
confidence: 99%
“…Starting from an initial guess x_0, where 0 < δ < σ < 1, the aim is to find an approximation of α_k such that the descent property is satisfied, without continuing to search along the direction when x_k is far from the solution. Thus, by the strong Wolfe line search conditions we inherit the advantages of the exact line search at a low computational cost [5]. The search direction d_k is generated by d_k = −g_k + β_k d_{k−1}, with d_0 = −g_0. These methods are identical when f is a strongly convex quadratic function and the line search is exact, since the gradients are mutually orthogonal and the parameters β_k in these methods are equal.…”
Section: Introduction
confidence: 99%
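The two strong Wolfe conditions referenced above (sufficient decrease plus the absolute-value curvature bound, with 0 < δ < σ < 1) can be verified directly. Here is a small Python sketch, where f and grad are assumed to be callables returning the objective value and gradient.

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, delta=1e-4, sigma=0.1):
    """Check the strong Wolfe conditions for a trial step alpha (0 < delta < sigma < 1).

    Sufficient decrease: f(x + a*d) <= f(x) + delta * a * grad(x)^T d
    Curvature:           |grad(x + a*d)^T d| <= sigma * |grad(x)^T d|
    """
    g0_d = grad(x) @ d                 # initial directional derivative
    x_trial = x + alpha * d
    decrease = f(x_trial) <= f(x) + delta * alpha * g0_d
    curvature = abs(grad(x_trial) @ d) <= sigma * abs(g0_d)
    return decrease and curvature
```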