2007
DOI: 10.1155/2007/18735
Optimal Control of Mechanical Systems

Abstract: In the present work, we consider a class of nonlinear optimal control problems, which can be called "optimal control problems in mechanics." We deal with control systems whose dynamics can be described by a system of Euler-Lagrange or Hamilton equations. Using the variational structure of the solution of the corresponding boundary-value problems, we reduce the initial optimal control problem to an auxiliary problem of multiobjective programming. This technique makes it possible to apply some consistent numeric…

Cited by 5 publications (2 citation statements) · References 18 publications
“…With the advances of modern computers and the increasing emphasis on optimal design of large scale dynamical systems under scarce availability of resources, optimal control theory has become a useful tool for solving many engineering, industrial and management problems. For a brief selection, see [1,2,3,4,5,7,9,10,11,14,17,18,27,28,29,30,31,32,33,34,35,37,41,44,45,50,53,63,64,65,66]. The main theoretical tools for solving optimal control problems analytically are the famous Pontryagin's minimum principle [1,2,8,52] and the Hamilton-Jacobi-Bellman equation [4,59].…”
mentioning
confidence: 99%
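The Hamilton-Jacobi-Bellman route mentioned above becomes fully explicit in the linear-quadratic special case, where the HJB equation reduces to the algebraic Riccati equation A^T P + P A − P B R^{-1} B^T P + Q = 0 and the optimal feedback is u = −R^{-1} B^T P x. A minimal sketch, using a double-integrator plant chosen purely for illustration (the matrices below are our own assumptions, not taken from the paper):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical plant: double integrator x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)           # state weighting in the quadratic cost
R = np.array([[1.0]])   # control weighting

# Solve the algebraic Riccati equation arising from the HJB equation
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal gain: u = -K x

# Closed-loop matrix A - B K should be Hurwitz (all eigenvalues stable)
eigs = np.linalg.eigvals(A - B @ K)
assert np.all(eigs.real < 0)
```

For this particular plant the Riccati solution can be checked by hand: K = [1, √3], giving the closed-loop polynomial λ² + √3 λ + 1.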
“…For an Euler-Lagrange system without nonholonomic constraints, the dimension of the input is often equal to that of the output, and the system can often be transformed into a double-integrator system by feedback linearization [12]. Other methods, such as the control Lyapunov function method [13], the passivity-based method [14], the optimal control method [15], etc., have also been successfully applied to the control of Euler-Lagrange systems without nonholonomic constraints. In contrast, since the dimension of the input is lower than that of the output, it is usually impossible to transform an Euler-Lagrange system subject to nonholonomic constraints directly into a linear system, so feedback linearization fails to stabilize the system.…”
Section: Introduction
mentioning
confidence: 99%
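The feedback-linearization step cited above can be sketched concretely. For a fully actuated Euler-Lagrange system M(q) q̈ + c(q, q̇) + g(q) = u, choosing u = M(q) v + c(q, q̇) + g(q) cancels the nonlinearities and leaves the double integrator q̈ = v, which an outer PD loop then stabilizes. The single pendulum below is a hypothetical illustrative plant (the names m, l, kp, kd are our own, not from the cited work):

```python
import math

m, l, grav = 1.0, 1.0, 9.81     # pendulum mass, length, gravity (assumed)

def inertia(q):
    """M(q) for a point-mass pendulum."""
    return m * l * l

def gravity_torque(q):
    """g(q): gravity term of the Euler-Lagrange dynamics."""
    return m * grav * l * math.sin(q)

def computed_torque(q, qd, q_ref, kp=25.0, kd=10.0):
    """Feedback-linearizing (computed-torque) control law.

    The virtual input v drives the linearized plant q'' = v, so the
    closed loop is the stable linear system q'' + kd q' + kp q = 0.
    """
    v = -kp * (q - q_ref) - kd * qd
    return inertia(q) * v + gravity_torque(q)

# Simulate with explicit Euler: the state converges to q_ref.
q, qd, q_ref, dt = 0.5, 0.0, 0.0, 1e-3
for _ in range(10_000):
    u = computed_torque(q, qd, q_ref)
    qdd = (u - gravity_torque(q)) / inertia(q)   # plant dynamics
    q, qd = q + dt * qd, qd + dt * qdd
assert abs(q - q_ref) < 1e-3
```

Note why the cited contrast holds: with nonholonomic constraints the input dimension drops below the output dimension, so no choice of u can cancel the dynamics this way, and the computed-torque trick above is unavailable.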