2017
DOI: 10.1109/tac.2016.2582642

Maximum Entropy Kernels for System Identification

Abstract: A new nonparametric approach for system identification has been recently proposed where the impulse response is modeled as the realization of a zero-mean Gaussian process whose covariance (kernel) has to be estimated from data. In this scheme, quality of the estimates crucially depends on the parametrization of the covariance of the Gaussian process. A family of kernels that have been shown to be particularly effective in the system identification framework is the family of Diagonal/Correlated (DC) kernels. Ma…
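The abstract (truncated above) centers on the Diagonal/Correlated (DC) kernel family. As a minimal sketch — hyperparameter values below are illustrative, not from the paper — the DC kernel is commonly written K(t, s) = c·λ^((t+s)/2)·ρ^|t−s|, and reduces to the Tuned/Correlated (TC) kernel K(t, s) = c·λ^max(t,s) when ρ = √λ:

```python
import numpy as np

def dc_kernel(t, s, c=1.0, lam=0.9, rho=0.7):
    """Diagonal/Correlated (DC) kernel: c * lam**((t+s)/2) * rho**|t-s|."""
    return c * lam ** ((t + s) / 2.0) * rho ** abs(t - s)

def tc_kernel(t, s, c=1.0, lam=0.9):
    """Tuned/Correlated (TC) kernel: c * lam**max(t, s)."""
    return c * lam ** max(t, s)

# With rho = sqrt(lam), the DC kernel collapses to the TC kernel:
#   lam**((t+s)/2) * lam**(|t-s|/2) = lam**((t+s+|t-s|)/2) = lam**max(t, s).
lam = 0.9
for t in range(1, 5):
    for s in range(1, 5):
        assert abs(dc_kernel(t, s, lam=lam, rho=np.sqrt(lam))
                   - tc_kernel(t, s, lam=lam)) < 1e-12
```

The extra parameter ρ decouples the correlation between neighboring impulse-response coefficients from their exponential decay rate λ, which is the flexibility the citing papers below exploit.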

Cited by 57 publications (37 citation statements)
References 22 publications
“…In particular, we derive a new orthonormal basis expansion of the DC kernel and the explicit expression of the norm of the RKHS associated with the DC kernel. Moreover, for the non-uniformly sampled DC kernel, we derive its maximum entropy property and show that its kernel matrix has tridiagonal inverse, which extend the corresponding results in [17], [18].…”
supporting
confidence: 69%
“…Proposition 5.2 leads to an interesting result: the kernel matrix of the DC kernel (6c) with t, s ∈ T has tridiagonal inverse, which is an extension of the result of [17] from the TC kernel (6b) to the DC kernel (6c) and an extension of the result of [18] from the uniformly sampled case to the non-uniformly sampled case.…”
Section: Proposition 5.2
mentioning
confidence: 99%
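The tridiagonal-inverse property quoted above can be verified numerically. A minimal sketch (the non-uniform time grid and hyperparameters are illustrative assumptions): build the DC Gram matrix on sorted, non-uniformly spaced points, invert it, and confirm that every entry more than one position off the diagonal vanishes:

```python
import numpy as np

# Sorted, non-uniformly sampled time points, as in the quoted result.
t = np.array([0.3, 1.0, 2.5, 4.0, 5.2])
lam, rho = 0.9, 0.7

# Gram matrix of the DC kernel K(t, s) = lam**((t+s)/2) * rho**|t-s|.
T, S = np.meshgrid(t, t, indexing="ij")
K = lam ** ((T + S) / 2.0) * rho ** np.abs(T - S)

Kinv = np.linalg.inv(K)

# Entries of K^{-1} with |i - j| >= 2 should vanish (tridiagonal inverse).
idx = np.arange(len(t))
off = np.abs(np.subtract.outer(idx, idx)) >= 2
print(np.max(np.abs(Kinv[off])))  # numerically ~0
```

The property follows from the semiseparable structure of the DC kernel, K(t, s) = u(min(t, s))·v(max(t, s)) with u(t) = λ^(t/2)ρ^(−t) and v(t) = λ^(t/2)ρ^t, and holds on any ordered grid, uniform or not.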
“…This type of Gaussian priors can be derived following Maximum Entropy arguments, see e.g. [13], [14]. The minimum variance estimate of the impulse response is then given by:…”
Section: A Point Estimator
mentioning
confidence: 99%
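The quoted passage breaks off before the estimator itself. For a Gaussian prior g ~ N(0, K) and data y = Φg + e with e ~ N(0, σ²I), the standard posterior-mean (minimum variance) form is ĝ = KΦᵀ(ΦKΦᵀ + σ²I)⁻¹y. A hedged sketch — dimensions, input signal, and hyperparameters below are illustrative, not taken from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20          # FIR length
N = 50          # number of output samples
sigma2 = 0.01   # noise variance (assumed known here)

# DC kernel Gram matrix over lags 1..n (illustrative hyperparameters).
k = np.arange(1, n + 1)
lam, rho = 0.9, 0.7
K = lam ** ((k[:, None] + k[None, :]) / 2.0) * rho ** np.abs(k[:, None] - k[None, :])

# Toeplitz regression matrix built from a random input sequence:
# y[j] = sum_i g[i] * u[j + n - i], the usual FIR convolution.
u = rng.standard_normal(N + n)
Phi = np.column_stack([u[n - i : n - i + N] for i in range(1, n + 1)])

g_true = 0.8 ** k                      # a decaying "true" impulse response
y = Phi @ g_true + np.sqrt(sigma2) * rng.standard_normal(N)

# Minimum variance (posterior mean) estimate:
#   g_hat = K Phi^T (Phi K Phi^T + sigma2 I)^{-1} y
g_hat = K @ Phi.T @ np.linalg.solve(Phi @ K @ Phi.T + sigma2 * np.eye(N), y)

# Equivalent regularized least-squares form (matrix inversion lemma):
g_hat2 = np.linalg.solve(Phi.T @ Phi + sigma2 * np.linalg.inv(K), Phi.T @ y)
print(np.max(np.abs(g_hat - g_hat2)))
```

The two forms agree by the matrix inversion lemma; the second makes explicit that the estimator is ridge regression with the kernel-induced regularizer gᵀK⁻¹g.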
“…These approximations yield different approaches, such as the so-called Empirical Bayes (EB) and Full Bayes (FB) estimators. Remark 1: In principle, the estimator (14) belongs to an infinite-dimensional space. However, for computational reasons, it is general practice to estimate a finite-length impulse response, whose length n is chosen large enough to capture the dynamics of the estimated system.…”
Section: A Point Estimator
mentioning
confidence: 99%
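The Empirical Bayes estimator mentioned above picks the kernel hyperparameters by maximizing the marginal likelihood of the data, y ~ N(0, ΦKΦᵀ + σ²I). A minimal sketch using a TC kernel and a plain grid search — the kernel choice, grid, and noise variance are illustrative assumptions, not the cited paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)

n, N, sigma2 = 20, 80, 0.01
k = np.arange(1, n + 1)

def tc_gram(lam):
    # TC kernel Gram matrix: K_ij = lam**max(i, j).
    return lam ** np.maximum(k[:, None], k[None, :])

u = rng.standard_normal(N + n)
Phi = np.column_stack([u[n - i : n - i + N] for i in range(1, n + 1)])
y = Phi @ (0.8 ** k) + np.sqrt(sigma2) * rng.standard_normal(N)

def neg_log_marglik(lam):
    # Under the GP prior, y ~ N(0, Phi K Phi^T + sigma2 I).
    Sigma = Phi @ tc_gram(lam) @ Phi.T + sigma2 * np.eye(N)
    _, logdet = np.linalg.slogdet(Sigma)
    return 0.5 * (y @ np.linalg.solve(Sigma, y) + logdet)

# Empirical Bayes: hyperparameter maximizing the marginal likelihood.
grid = np.linspace(0.05, 0.99, 50)
lam_hat = grid[np.argmin([neg_log_marglik(l) for l in grid])]
print(lam_hat)
```

In practice the marginal likelihood is optimized with a gradient-based solver rather than a grid, and a Full Bayes estimator would instead integrate over the hyperparameters, e.g. by MCMC.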
“…The kernel design plays a similar role as the model structure design for ML/PEM and determines the underlying model structure for KRM. In the past few years, many efforts have been spent on this issue and several kernels have been invented to embed various types of prior knowledge, e.g., Carli et al. (2017); Chen et al. (2012, 2014); Dinuzzo (2015); Marconato et al. (2016); Pillonetto et al. (2011, 2016); Pillonetto & De Nicolao (2010); Zorzi & Chiuso (2017). In particular, two systematic kernel design methods (one from a machine learning perspective and the other from a system theory perspective) were developed in by embedding the corresponding type of prior knowledge.…”
Section: Introductionmentioning
confidence: 99%