2017
DOI: 10.1002/rnc.3829
Nonlinear–nonquadratic optimal and inverse optimal control for stochastic dynamical systems

Abstract: In this paper, we develop a unified framework to address the problem of optimal nonlinear analysis and feedback control for nonlinear stochastic dynamical systems. Specifically, we provide a simplified and tutorial framework for stochastic optimal control and focus on connections between stochastic Lyapunov theory and stochastic Hamilton–Jacobi–Bellman theory. In particular, we show that asymptotic stability in probability of the closed‐loop nonlinear system is guaranteed by means of a Lyapunov function…
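For context, the abstract's setup can be read against the standard infinite-horizon stochastic optimal control problem. The sketch below uses notation assumed here, not taken from the paper: a controlled Itô diffusion, a nonlinear-nonquadratic performance functional, and the stationary stochastic Hamilton–Jacobi–Bellman equation whose solution V characterizes the optimal feedback.

\[
\mathrm{d}x(t) = \bigl[f(x(t)) + G(x(t))\,u(t)\bigr]\,\mathrm{d}t + D(x(t))\,\mathrm{d}w(t), \qquad x(0) = x_0,
\]
\[
J(x_0, u(\cdot)) = \mathbb{E}\!\left[\int_0^{\infty} L(x(t), u(t))\,\mathrm{d}t\right],
\]
\[
0 = \min_{u}\Bigl[L(x,u) + V'(x)\bigl(f(x) + G(x)u\bigr) + \tfrac{1}{2}\,\operatorname{tr}\bigl(D^{\mathsf{T}}(x)\,V''(x)\,D(x)\bigr)\Bigr].
\]

Here w(·) is a standard Wiener process and V'(x), V''(x) denote the gradient and Hessian of the Lyapunov/value function; regularity and positive-definiteness conditions are omitted.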

Cited by 17 publications (14 citation statements) | References 32 publications
“…Nonlinear Time-Varying Discrete Stochastic Systems. Stochastic control systems are characterized by uncertainty [30]. For example, the autonomous control system of an unmanned aerial vehicle (UAV) [31] is affected during operation by unstable wind [32], electromagnetic interference [33], noise signals [34], and other disturbances, so its control system exhibits nonlinear, discrete, time-varying, stochastic, and other characteristics [35]. A stochastic system can be represented in the form of Equations (1) and (2) [36].…”
Section: Introduction (mentioning)
confidence: 99%
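The cited Equations (1) and (2) are not reproduced in this excerpt. A generic discrete-time, nonlinear, time-varying stochastic state-space form of the kind the quote describes would read as follows; the symbols f_k, g_k, h_k, w_k, and v_k are illustrative assumptions, not the cited equations themselves.

\[
x_{k+1} = f_k(x_k, u_k) + g_k(x_k)\,w_k,
\]
\[
y_k = h_k(x_k) + v_k,
\]

with state x_k, input u_k, measurement y_k, and process and measurement noise sequences w_k and v_k.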
“…In a recent paper by Rajpurohit and Haddad, the authors presented a framework for analyzing and designing feedback controllers for nonlinear stochastic dynamical systems. Specifically, they considered a stochastic feedback control problem over an infinite horizon involving a nonlinear‐nonquadratic performance functional, and the performance functional was evaluated in closed form provided that the nonlinear‐nonquadratic cost functional is related in a specific way to an underlying Lyapunov function that guarantees asymptotic stability in probability of the nonlinear closed‐loop system.…”
Section: Introduction (mentioning)
confidence: 99%
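A minimal sketch of the closed-form evaluation being described, assuming a control-affine Itô diffusion as above and omitting technical conditions on V (the notation is assumed here, not quoted from the paper): if the running cost and the feedback φ satisfy a Lyapunov-type identity involving the infinitesimal generator of the closed-loop diffusion, then the performance functional equals the Lyapunov function evaluated at the initial state.

\[
L\bigl(x, \phi(x)\bigr) + \mathcal{L}V(x) = 0 \ \ \text{for all } x
\quad\Longrightarrow\quad
J\bigl(x_0, \phi(\cdot)\bigr) = V(x_0),
\]
\[
\mathcal{L}V(x) \triangleq V'(x)\bigl[f(x) + G(x)\phi(x)\bigr] + \tfrac{1}{2}\,\operatorname{tr}\bigl(D^{\mathsf{T}}(x)\,V''(x)\,D(x)\bigr).
\]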
“…The approach in the work of the aforementioned authors focused on the role of the Lyapunov function guaranteeing stochastic stability of the closed‐loop system and its connection to the steady‐state solution of the stochastic Hamilton‐Jacobi‐Bellman equation characterizing the optimal nonlinear feedback controller. In order to avoid the complexity of solving the steady‐state stochastic Hamilton‐Jacobi‐Bellman equation, we do not attempt to minimize a given cost functional; rather, we parameterize a family of stochastically stabilizing controllers that minimizes a derived cost functional, which provides flexibility in specifying the control law.…”
Section: Introduction (mentioning)
confidence: 99%
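A hedged sketch of the inverse-optimal construction being described, assuming a control-affine diffusion and a cost with a quadratic control weighting R_2(x) (these structural choices are assumptions, not quoted from the cited work): the stabilizing feedback is parameterized through the Lyapunov function V, and the state-dependent part of the cost is derived afterwards so that the stationary Hamilton–Jacobi–Bellman identity holds.

\[
\phi(x) = -\tfrac{1}{2}\,R_2^{-1}(x)\,G^{\mathsf{T}}(x)\,V'^{\mathsf{T}}(x),
\qquad
L(x,u) = L_1(x) + u^{\mathsf{T}} R_2(x)\,u,
\]
\[
L_1(x) = -V'(x)\,f(x) - \tfrac{1}{2}\,\operatorname{tr}\bigl(D^{\mathsf{T}}(x)\,V''(x)\,D(x)\bigr) + \tfrac{1}{4}\,V'(x)\,G(x)\,R_2^{-1}(x)\,G^{\mathsf{T}}(x)\,V'^{\mathsf{T}}(x).
\]

With this choice the controller minimizes the derived cost rather than a cost fixed in advance, which is the flexibility the excerpt refers to.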