1984
DOI: 10.1080/00207178408933304

Comparison of optimization algorithms

Abstract: The measured convergence rates of several algorithms belonging to the class of 'gradient methods in function space' are compared. The steepest descent, conjugate gradient and quasi-Newton algorithms, and the effect of different types of linear search routine and initial guess parameters on their convergence rates, are considered. It is concluded that conjugate gradient methods do offer significant advantages, even for non-linear system models and bounded controls.
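A minimal sketch of the comparison the abstract describes: steepest descent versus conjugate gradients, each with exact line searches, on a two-variable quadratic. The paper itself works with gradient methods in function space; the finite-dimensional matrix, tolerance, and iteration counts below are illustrative assumptions, not from the paper.

```python
import numpy as np

def steepest_descent(A, b, x0, iters):
    """Minimize f(x) = 0.5 x'Ax - b'x by steepest descent with exact line search."""
    x = x0.copy()
    for _ in range(iters):
        r = b - A @ x                      # negative gradient
        if np.linalg.norm(r) < 1e-12:
            break
        alpha = (r @ r) / (r @ (A @ r))    # exact minimizing step for a quadratic
        x = x + alpha * r
    return x

def conjugate_gradient(A, b, x0, iters):
    """Linear conjugate gradients on the same quadratic model problem."""
    x = x0.copy()
    r = b - A @ x
    p = r.copy()
    for _ in range(iters):
        if np.linalg.norm(r) < 1e-12:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves coefficient
        p = r_new + beta * p
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])           # symmetric positive definite
b = np.array([1.0, 2.0])
x_star = np.linalg.solve(A, b)
x_cg = conjugate_gradient(A, b, np.zeros(2), 2)  # exact after n = 2 steps
x_sd = steepest_descent(A, b, np.zeros(2), 2)    # still only approximate
```

On a quadratic, CG terminates in at most n iterations while steepest descent zig-zags; the paper's conclusion is that a comparable advantage survives for non-linear system models and bounded controls.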

Cited by 34 publications (14 citation statements). References 16 publications.
“…in the non-negative orthant. For instance, in [U] a steplength α is determined so that for Algorithm 1: (29) and for Algorithm 2: p(α) = p_k + α(Δp″ + λΔp′) + α²Δp* (30), where we define p^T = [x^T, λ^T, v^T, s^T] along with similar notation for related terms. Using the less restrictive guidelines of [85], the largest steplength ᾱ ∈ (0, 1] is chosen that satisfies:…”
Section: IP Methods Within SQP
confidence: 99%
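The quoted passage concerns picking the largest steplength ᾱ ∈ (0, 1] that keeps the iterate in the non-negative orthant. A standard rule of this kind in interior-point codes is the fraction-to-boundary rule; the sketch below assumes that rule, and the function name, the default `tau`, and the test vectors are illustrative, not taken from the cited papers.

```python
import numpy as np

def max_step_to_boundary(s, ds, tau=0.995):
    """Fraction-to-boundary rule: the largest alpha in (0, 1] such that
    s + alpha * ds >= (1 - tau) * s, keeping the strictly positive
    variables s (slacks or multipliers) safely inside the orthant."""
    shrinking = ds < 0                           # only decreasing components can bind
    if not np.any(shrinking):
        return 1.0
    ratios = -tau * s[shrinking] / ds[shrinking] # step at which each component binds
    return min(1.0, ratios.min())

s = np.array([1.0, 0.5, 2.0])     # current interior point
ds = np.array([-2.0, 0.1, -0.5])  # proposed Newton step
alpha = max_step_to_boundary(s, ds)
```

Here the first component binds, giving α = 0.995 · 1.0 / 2.0 = 0.4975; taking the full step α = 1 would have driven that component negative.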
“…Any of single shooting, multiple shooting, or global methods may be used for the solution of the BVP (see Bryson and Ho [12] for details). The indirect method works well for unconstrained problems, but the presence of inequalities in the model poses difficulties for these methods (Jones and Finch [30], Ray [65]).…”
Section: Control and Optimization
confidence: 99%
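The single-shooting approach mentioned for the BVP can be sketched on a scalar linear-quadratic example: guess the unknown initial costate, integrate state and costate forward, and adjust the guess until the terminal condition holds. The dynamics, cost, and bisection bracket below are illustrative assumptions, not an example taken from Bryson and Ho.

```python
import numpy as np

def integrate(lam0, n=1000):
    """RK4 integration of the Pontryagin two-point system for
    min 0.5 * int u^2 dt, with xdot = -x + u, x(0) = 0 on [0, 1].
    The optimality conditions give u = -lam and lamdot = lam."""
    h = 1.0 / n
    def f(y):
        x, lam = y
        return np.array([-x - lam, lam])
    y = np.array([0.0, lam0])
    for _ in range(n):
        k1 = f(y); k2 = f(y + h/2*k1); k3 = f(y + h/2*k2); k4 = f(y + h*k3)
        y = y + h/6 * (k1 + 2*k2 + 2*k3 + k4)
    return y[0]                       # x(1) as a function of lam(0)

def shoot(target=1.0, lo=-10.0, hi=10.0):
    """Single shooting: bisect on lam(0) until x(1) = target.
    x(1) is decreasing in lam(0), so integrate(lo) > target > integrate(hi)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if integrate(mid) > target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam0 = shoot()    # analytic answer for this example: -1/sinh(1)
```

The quoted difficulty with inequality constraints shows up even in this toy: adding a bound on u would break the smooth dependence of x(1) on lam(0) that the root-finding step relies on.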
“…The sequential solution and optimization requires the solution of differential equations at each iteration of the optimization. Jones and Finch (1984) found that such methods spend about 85% of the time integrating the model equations in order to obtain gradient information. This can make the implementation of this algorithm computationally expensive for cases involving a large number of model equations.…”
Section: Sequential Solution and Optimization Algorithm
confidence: 99%
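The cost structure described above (most of the run time spent integrating the model to obtain gradients) is easy to reproduce in spirit: in a sequential scheme, every objective and finite-difference gradient evaluation requires a full integration of the model equations. A toy sketch, where the dynamics xdot = -x + u, the tracking objective, and the fixed gradient-descent step are all invented for illustration:

```python
import numpy as np

def simulate(params, n_steps=400):
    """One full integration of the model (forward Euler) for a given
    piecewise-constant control vector; returns the objective value."""
    n_seg = len(params)
    h = 1.0 / n_steps
    x, cost = 0.0, 0.0
    for i in range(n_steps):
        u = params[min(i * n_seg // n_steps, n_seg - 1)]
        cost += h * ((x - 1.0) ** 2 + 0.01 * u ** 2)   # tracking + control effort
        x += h * (-x + u)
    return cost

def fd_gradient(params, eps=1e-6):
    """Finite-difference gradient: one extra full integration of the model
    per decision variable -- this is where the bulk of the time goes."""
    base = simulate(params)
    grad = np.empty_like(params)
    for j in range(len(params)):
        p = params.copy()
        p[j] += eps
        grad[j] = (simulate(p) - base) / eps
    return grad

u = np.zeros(4)
for _ in range(150):            # plain gradient descent on the controls
    u -= 2.0 * fd_gradient(u)   # 5 model integrations per outer iteration
```

With 4 decision variables, each optimizer iteration already costs 5 integrations; for large models the integration share of total time dominates, consistent with the ~85% figure attributed to Jones and Finch (1984).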
“…A subroutine is used for numerical integration of the state equations. A comparison of several CVI algorithms is given in [Jones84] and an extensive theoretical treatment can be found in [Goh88]. An alternative approach to CVI is iterative dynamic programming (IDP), which relies on forward integration of the state equations, discretization of possible values of u(t) over a region and subsequent repeated region contraction [Luus93a, Luus93b].…”
Section: Piecewise Constant or Piecewise Linear u(t)
confidence: 99%
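The IDP idea quoted above (try candidate values of u(t) in a region around the incumbent, then repeatedly contract the region) can be caricatured with a random-search sketch. This is a loose illustration, not Luus's algorithm: the model, region size, contraction factor, and candidate counts are all invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(u_segments, n_steps=400):
    """Forward (Euler) integration of xdot = -x + u with piecewise-constant
    u on [0, 1]; objective is squared tracking error to x_ref = 1."""
    n_seg = len(u_segments)
    h = 1.0 / n_steps
    x, J = 0.0, 0.0
    for i in range(n_steps):
        u = u_segments[min(i * n_seg // n_steps, n_seg - 1)]
        J += h * (x - 1.0) ** 2
        x += h * (-x + u)
    return J

def idp_like(n_seg=4, n_pass=30, n_cand=20, region=4.0, contract=0.85):
    """Sample candidate piecewise-constant controls in a region around the
    incumbent, keep the best, then contract the region (the IDP-flavoured step)."""
    best_u = np.zeros(n_seg)
    best_J = cost(best_u)
    r = region
    for _ in range(n_pass):
        for _ in range(n_cand):
            cand = best_u + rng.uniform(-r, r, size=n_seg)
            J = cost(cand)
            if J < best_J:
                best_u, best_J = cand, J
        r *= contract                 # region contraction
    return best_u, best_J

u_opt, J_opt = idp_like()
```

Note that, like the CVI methods being compared, each candidate evaluation requires a forward integration of the state equations; only the search strategy differs.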