2011
DOI: 10.1109/tnn.2011.2171361
Reduced-Size Kernel Models for Nonlinear Hybrid System Identification

Abstract: The paper focuses on the identification of nonlinear hybrid dynamical systems, i.e., systems switching between multiple nonlinear dynamical behaviors. The aim is thus to learn an ensemble of submodels from a single set of input-output data in a regression setting, with no prior knowledge of the grouping of the data points into similar behaviors. To be able to approximate arbitrary nonlinearities, kernel submodels are considered. However, in order to maintain efficiency when applying the method to large …

Cited by 27 publications (6 citation statements). References 17 publications.
“…The convergence of the randomized sample-and-evaluate procedure proposed to solve the MSS problem (7) within the continuous framework described in Section 3 has been shown in [6]. In particular, when P_γ is sufficiently close to the target distribution in (9), the signs of the gradients in (19) provide reliable information for tuning the mode-extraction and regression-inclusion probabilities toward that target through the iterative application of the update rules in (20). Practical experience shows that the algorithm converges even when it is iterated with P_γ randomly initialized.…”
Section: Algorithm Convergence
confidence: 99%
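The statement above describes tuning inclusion probabilities by stepping them in the direction indicated by the sign of a cost gradient. A minimal sketch of that idea follows; the step size, variable names, and clipping to [0, 1] are assumptions for illustration, not the actual update rule (20) of the cited work:

```python
import numpy as np

def update_probabilities(p, grad_sign, step=0.05):
    """Move each inclusion probability a small step against the sign
    of its estimated cost gradient, then clip so every entry stays a
    valid probability in [0, 1].
    Hypothetical rule, sketching the gradient-sign tuning idea only."""
    p = p - step * grad_sign
    return np.clip(p, 0.0, 1.0)

# Randomly/uniformly initialized probabilities, as in the quoted remark.
p = np.full(4, 0.5)
grad_sign = np.array([1.0, -1.0, 1.0, -1.0])  # signs of estimated gradients
p_new = update_probabilities(p, grad_sign)    # → [0.45, 0.55, 0.45, 0.55]
```

Iterating such sign-based steps drives the probabilities toward a limit distribution, which is the convergence behavior the citation statement refers to.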
“…Some of these works have also been extended to the case of nonlinear modes, by resorting to the Nonlinear AutoRegressive with eXogenous input (NARX) modeling framework [22], [23]. For example, a framework based on kernel functional expansions to represent the nonlinear functions and on the minimization of a cost function involving only the continuous parameters of the model as variables is introduced in [19] (and later extended in [18], [20]). In [16], the authors propose an extension of the sum-of-norms approach described in [29] to piecewise systems with nonlinear dynamics, based again on kernel functional expansions.…”
Section: Introduction
confidence: 99%
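The kernel functional expansions mentioned above represent each nonlinear submodel as f(x) = Σᵢ αᵢ k(xᵢ, x). A minimal sketch of one such submodel, fitted by standard kernel ridge regression, is shown below; the Gaussian kernel, class name, and regularization are illustrative assumptions, not the estimator of any of the cited papers:

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

class KernelSubmodel:
    """One nonlinear submodel f(x) = sum_i alpha_i * k(x_i, x),
    fitted by kernel ridge regression on the points assigned to its
    mode. Illustrative sketch of a kernel functional expansion."""
    def __init__(self, sigma=1.0, reg=1e-3):
        self.sigma, self.reg = sigma, reg
    def fit(self, X, y):
        K = gaussian_kernel(X, X, self.sigma)
        self.X = X
        # Regularized linear system for the expansion coefficients alpha.
        self.alpha = np.linalg.solve(K + self.reg * np.eye(len(X)), y)
        return self
    def predict(self, X):
        return gaussian_kernel(X, self.X, self.sigma) @ self.alpha

X = np.linspace(-1, 1, 20).reshape(-1, 1)
y = np.sin(3 * X).ravel()
model = KernelSubmodel(sigma=0.3).fit(X, y)
```

In a hybrid-system setting, one such expansion would be learned per mode; the "reduced-size" contribution of the surveyed paper concerns keeping the number of kernel terms small as the dataset grows.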
“…Theorem 3.3. Let P_γ be the probability distribution over Λ defined according to (19), which depends on η in (16) and µ in (17). Then there exists ε ∈ (0, 1) such that, if P_γ(λ) ≥ ε, the iterative application of (21) and (23) will make P_γ converge to the target limit distribution.…”
Section: Tuning of P_γ
confidence: 99%
“…A reformulation of the optimization problem in a continuous framework is studied in [17] and [18], thus allowing the use of efficient solvers and enabling the solution of larger problems. The efficiency of this method is further improved in [19] by introducing fixed-size kernel submodels. In [15], the authors proposed an extension of the sum-of-norms approach described in [27] to piecewise systems with nonlinear dynamics, based again on kernel functional expansions.…”
Section: Introduction
confidence: 99%
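The sum-of-norms approach referred to above penalizes the norms of consecutive parameter differences, so that most differences are driven to zero and the parameters become piecewise constant over time (i.e., only a few mode switches survive). The following is a sketch of the structure of such a cost; the function name, weighting, and data layout are assumptions, not the solver of the cited work:

```python
import numpy as np

def sum_of_norms_cost(theta, X, y, lam=1.0):
    """Least-squares fit plus a sum-of-norms penalty on consecutive
    parameter differences ||theta_t - theta_{t-1}||. The penalty acts
    like a group lasso over time: most differences become exactly
    zero, yielding piecewise-constant parameters with few switches.
    Sketch of the regularizer's structure only.

    theta: (T, d) per-time parameter vectors
    X:     (T, d) regressor vectors
    y:     (T,)   outputs
    """
    residuals = np.einsum('td,td->t', X, theta) - y   # per-time prediction error
    fit = 0.5 * np.sum(residuals ** 2)
    switches = np.linalg.norm(np.diff(theta, axis=0), axis=1)
    return fit + lam * switches.sum()
```

Minimizing such a cost (typically with a convex solver) recovers both the local models and the switching instants in one step; the extension discussed in the citation replaces the linear regressors with kernel functional expansions.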
“…Various approaches have been introduced in the recent literature to address this task (see, e.g., [12], [4], [6]). Comparatively fewer works address the case with nonlinear local models, and typically in a non-parametric setting using kernel functional expansions (see, e.g., [1], [7], [8], [9]). Employing a parametric framework can lead to more compact and interpretable models, but adds another dimension to the problem.…”
Section: Introduction
confidence: 99%