2017
DOI: 10.1137/14096815x

Kernel Methods for the Approximation of Nonlinear Systems

Abstract: We introduce a data-driven model approximation method for nonlinear control systems, drawing on recent progress in machine learning and statistical dimensionality reduction. The method is based on embedding the nonlinear system in a high (or infinite) dimensional reproducing kernel Hilbert space (RKHS) where linear balanced truncation may be carried out implicitly. This leads to a nonlinear reduction map which can be combined with a representation of the system belonging to a RKHS to give a closed, reduced order…
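The abstract describes lifting balanced truncation to nonlinear systems via an RKHS embedding. As a point of reference only, the following is a minimal Python/NumPy/SciPy sketch of the classical linear balanced truncation step that the method generalizes; the helper name `balanced_truncation`, the random test system, and the reduction formulas are standard textbook constructions, not code from the paper.

```python
# Minimal sketch: classical balanced truncation of a stable LTI system (A, B, C).
# The paper carries an analogous construction out implicitly in an RKHS; this only
# illustrates the linear building block it generalizes.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Return an order-r reduced model (Ar, Br, Cr) via balanced truncation."""
    # Gramians: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0.
    Wc = solve_continuous_lyapunov(A, -B @ B.T)
    Wo = solve_continuous_lyapunov(A.T, -C.T @ C)
    # Balance via Cholesky factors and the SVD of their product.
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)          # s are the Hankel singular values
    S_r = np.diag(s[:r] ** -0.5)
    T = Lc @ Vt.T[:, :r] @ S_r         # reduction map (state -> full space)
    Tinv = S_r @ U[:, :r].T @ Lo.T     # left inverse of T
    return Tinv @ A @ T, Tinv @ B, C @ T

# Example usage on a randomly generated stable single-input, single-output system.
n, r = 6, 2
A = -np.eye(n) + 0.1 * np.random.randn(n, n)
B = np.random.randn(n, 1)
C = np.random.randn(1, n)
Ar, Br, Cr = balanced_truncation(A, B, C, r)
```

The Hankel singular values `s` indicate how many states are worth keeping; per the abstract, the paper's contribution is performing this balancing step implicitly in the kernel feature space and combining the resulting nonlinear reduction map with an RKHS representation of the system.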

Cited by 51 publications (55 citation statements) | References 41 publications
“…In this section, we give a brief review of reproducing kernel Hilbert spaces (RKHSs) as used in statistical learning theory. The discussion here mostly follows previous studies; for a historical perspective, see also Aronszajn and Schönberg, and for a recent application related to dynamics, see Bouvrie and Hamzi. The key problem we want to study with this theory is to distinguish two different probability distributions.…”
Section: Kernels and MMD (mentioning)
confidence: 98%
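The excerpt above uses RKHS machinery to distinguish two probability distributions; the standard tool for this is the maximum mean discrepancy (MMD). Below is a minimal sketch of the biased (V-statistic) empirical MMD² estimate with a Gaussian kernel; the function names, bandwidth choice, and test data are illustrative assumptions, not code from the cited work.

```python
# Minimal sketch: biased empirical MMD^2 between two sample sets X and Y,
# using a Gaussian RKHS kernel. Illustrative only.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gram matrix with entries k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_biased(X, Y, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples X and Y."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

# Example: samples from two Gaussians with shifted means should give MMD^2 > 0.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.5, 1.0, size=(200, 2))
print(mmd2_biased(X, Y, sigma=1.0))
```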
“…Other choices are for example: covariance-weighted products for Gaussian-noise-driven systems yielding system covariances [76], reproducing kernel Hilbert spaces (RKHS) [77], such as the polynomial, Gaussian or Sigmoid kernels [78], or energy-stable inner products [79]. Also, weighted Gramians [36] and time-weighted system Gramians [80] can be computed using this interface, i.e., dp = @(x,y) mtimes([0:h:T].^k.…”
Section: Inner Product Interface (mentioning)
confidence: 99%
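This excerpt describes an interface where the inner product used in empirical Gramian computation can be swapped for a kernel (polynomial, Gaussian, sigmoid) or a time-weighted product, supplied as a function handle like the truncated MATLAB snippet above. Below is a hedged Python sketch of the same plug-in idea; the function `gram_matrix`, the callback signature, and the example kernels are assumptions for illustration, not the cited toolbox's actual API.

```python
# Minimal sketch of a pluggable inner product for snapshot Gram matrices:
# columns of X are state snapshots, and dp(x, y) is a user-supplied product.
import numpy as np

def gram_matrix(X, dp):
    """Pairwise Gram matrix G[i, j] = dp(x_i, x_j) over the snapshot columns of X."""
    n = X.shape[1]
    return np.array([[dp(X[:, i], X[:, j]) for j in range(n)] for i in range(n)])

# Standard Euclidean inner product.
dot_dp = lambda x, y: x @ y
# Polynomial kernel of degree 2, one of the RKHS choices mentioned in the excerpt.
poly_dp = lambda x, y: (x @ y + 1.0) ** 2

# Example usage: a 4-dimensional state sampled at 10 time steps.
X = np.random.randn(4, 10)
G_lin = gram_matrix(X, dot_dp)
G_poly = gram_matrix(X, poly_dp)
```

The design point is that the downstream Gramian or reduction code never changes; only the callback does, which is how a single interface can cover covariance-weighted, kernel, and time-weighted variants.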
“…Therefore, cheaper numerical approximations using "adequate-fidelity" models are usually acceptable [2]. In this regard, reduced order modeling offers a viable technique to address systems characterized by underlying patterns [3][4][5][6][7][8][9][10][11][12]. This is especially true for fluid flows dominated by coherent structures (e.g., atmospheric and oceanic flows) [13][14][15][16][17][18][19][20][21].…”
Section: Introduction (mentioning)
confidence: 99%