2014
DOI: 10.3182/20140824-6-za-1003.01843
Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM

Abstract: Gaussian process state-space models (GP-SSMs) are a very flexible family of models of nonlinear dynamical systems. They comprise a Bayesian nonparametric representation of the dynamics of the system and additional (hyper-)parameters governing the properties of this nonparametric representation. The Bayesian formalism enables systematic reasoning about the uncertainty in the system dynamics. We present an approach to maximum likelihood identification of the parameters in GP-SSMs, while retaining the full nonparametric…
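To make the model class concrete, the sketch below draws one trajectory from a one-dimensional GP-SSM by sampling the unknown transition function on the fly from a zero-mean GP prior with a squared-exponential kernel. The kernel choice, the identity observation model, and the noise levels are illustrative assumptions, not the specific parameterization used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def se_kernel(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential covariance, a common GP prior choice (assumed here)."""
    a, b = np.atleast_1d(a), np.atleast_1d(b)
    return sf2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def sample_gpssm(T=50, q=0.05, r=0.1, ell=1.0, sf2=1.0):
    """Draw one trajectory from an illustrative 1-D GP-SSM:
         x_{t+1} = f(x_t) + v_t,  f ~ GP(0, k),  v_t ~ N(0, q)
         y_t     = x_t + e_t,                    e_t ~ N(0, r)
    The transition function f is sampled incrementally by conditioning the GP
    on the function values already generated along the trajectory.
    """
    x = np.zeros(T)
    X, F = [], []                       # inputs and sampled values of f so far
    for t in range(T - 1):
        kxx = se_kernel(x[t], x[t], ell, sf2)[0, 0]
        if X:
            K = se_kernel(X, X, ell, sf2) + 1e-9 * np.eye(len(X))
            kx = se_kernel(X, x[t], ell, sf2)[:, 0]
            mean = kx @ np.linalg.solve(K, np.array(F))
            var = kxx - kx @ np.linalg.solve(K, kx)
        else:
            mean, var = 0.0, kxx
        f_t = rng.normal(mean, np.sqrt(max(var, 0.0)))
        X.append(x[t]); F.append(f_t)
        x[t + 1] = f_t + rng.normal(0.0, np.sqrt(q))
    y = x + rng.normal(0.0, np.sqrt(r), size=T)
    return x, y

states, obs = sample_gpssm()
```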

Cited by 30 publications (39 citation statements) · References 19 publications
“…Finally, deriving a bound for the more advanced GP-based model as given in [2] and [28] would be very interesting as our future work.…”
Section: Discussion
confidence: 99%
“…The study of Gaussian process (GP)-based state-space models has been emerging in recent years. Dynamic models for the state formulated as Gaussian processes are introduced in [2]. There are also studies on Gaussian process-based measurement models, such as [3] and [4].…”
Section: A. Motivations and Background
confidence: 99%
“…Under mild regularity conditions, these iterations yield a sequence of parameter estimates that converges to a stationary point of the marginal likelihood of the data (under the condition that the number of particles $M_k$ at iteration $k$ is such that $\sum_{k=1}^{\infty} M_k^{-1} = \infty$; see [32]). Then, using the estimated hyperparameters, we can run a new Gibbs sampler and approximate the integrals in (17) with averages over the samples:…”
Section: Update the Parameters According to Corollary
confidence: 99%
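As a small illustration of the quoted convergence condition, the sketch below pairs a linearly growing particle schedule, for which the harmonic-like series $\sum_k M_k^{-1}$ still diverges, with a generic sample-average approximation of a posterior expectation. The names `particle_schedule`, `draw_trajectory`, and `g` are hypothetical stand-ins; they are not the cited paper's quantities or the integrals labelled (17).

```python
import numpy as np

def particle_schedule(k, m0=10):
    """Illustrative schedule: M_k grows with the SAEM iteration index k,
    yet sum_k 1/M_k diverges (harmonic-like series), matching the
    condition quoted above."""
    return m0 + k

def mc_average(draw_trajectory, g, n_samples=500):
    """Approximate E[g(x)] under the smoothing distribution by averaging
    g over trajectories returned by a (hypothetical) Gibbs sampler
    `draw_trajectory` -- a stand-in for averaging over posterior samples."""
    samples = [g(draw_trajectory()) for _ in range(n_samples)]
    return np.mean(samples, axis=0)
```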
“…In practice, the ancestor sampling in the CSMC procedure gives rise to a considerable improvement over PG, comparable to that of backward simulation. The method has been successfully applied to challenging inference problems, such as Wiener system identification (Lindsten et al, 2013b) and learning of nonparametric, nonlinear SSMs (Frigola et al, 2013). We illustrate the PGAS method in the following example.…”
Section: Particle Gibbs with Ancestor Sampling
confidence: 99%
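For readers unfamiliar with PGAS, the following is a minimal sketch of one particle Gibbs sweep with ancestor sampling, applied to a standard one-dimensional benchmark state-space model rather than the GP-SSM of the cited work; the model, noise variances, and particle count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (not the paper's GP-SSM) 1-D nonlinear state-space model:
#   x_t = 0.5 x_{t-1} + 25 x_{t-1} / (1 + x_{t-1}^2) + v_t,  v_t ~ N(0, q)
#   y_t = 0.05 x_t^2 + e_t,                                  e_t ~ N(0, r)
q, r = 1.0, 0.1

def f(x):                     # transition mean
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2)

def log_trans(x_next, x):     # log p(x_next | x), up to a constant
    return -0.5 * (x_next - f(x)) ** 2 / q

def log_lik(y, x):            # log p(y | x), up to a constant
    return -0.5 * (y - 0.05 * x ** 2) ** 2 / r

def pgas_step(y, x_ref, n_particles=20):
    """One sweep of particle Gibbs with ancestor sampling (PGAS): run a
    bootstrap particle filter conditioned on the reference trajectory
    `x_ref`, then draw a new trajectory from the particle approximation."""
    T, N = len(y), n_particles
    x = np.zeros((T, N))
    a = np.zeros((T, N), dtype=int)       # ancestor indices
    logw = np.zeros((T, N))

    # t = 0: sample freely, but deterministically include the reference state
    x[0, :-1] = rng.normal(0.0, np.sqrt(q), size=N - 1)
    x[0, -1] = x_ref[0]
    logw[0] = log_lik(y[0], x[0])

    for t in range(1, T):
        w = np.exp(logw[t - 1] - logw[t - 1].max())
        w /= w.sum()

        # Resample ancestors for the N-1 "free" particles and propagate them
        a[t, :-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :-1] = f(x[t - 1, a[t, :-1]]) + rng.normal(0.0, np.sqrt(q), size=N - 1)

        # Ancestor sampling for the reference particle: weight each previous
        # particle by how well it explains the next reference state
        x[t, -1] = x_ref[t]
        logw_as = np.log(w) + log_trans(x_ref[t], x[t - 1])
        w_as = np.exp(logw_as - logw_as.max())
        a[t, -1] = rng.choice(N, p=w_as / w_as.sum())

        logw[t] = log_lik(y[t], x[t])

    # Draw one trajectory index and trace its ancestry backwards in time
    w = np.exp(logw[-1] - logw[-1].max())
    k = rng.choice(N, p=w / w.sum())
    traj = np.zeros(T)
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        k = a[t, k]
    return traj

# Usage: iterate PGAS sweeps to form a Markov chain over state trajectories
T = 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = f(x_true[t - 1]) + rng.normal(0.0, np.sqrt(q))
y = 0.05 * x_true ** 2 + rng.normal(0.0, np.sqrt(r), size=T)

x_ref = np.zeros(T)                       # arbitrary initial reference
for sweep in range(50):
    x_ref = pgas_step(y, x_ref)
```

Conditioning on the returned trajectory at the next sweep is what makes this a valid Gibbs kernel; the ancestor-sampling step is the modification that lets the conditioned path mix well even with few particles.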