2015
DOI: 10.1109/tnnls.2014.2311073

A Latent Manifold Markovian Dynamics Gaussian Process

Abstract: In this paper, we propose a Gaussian process (GP) model for analysis of nonlinear time series. Formulation of our model is based on the consideration that the observed data are functions of latent variables, with the associated mapping between observations and latent representations modeled through GP priors. In addition, to capture the temporal dynamics in the modeled data, we assume that subsequent latent representations depend on each other on the basis of a hidden Markov prior imposed over them. Derivation…
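To make the model structure concrete, below is a minimal generative sketch in Python of the construction the abstract describes: latent states evolve under a first-order Markov prior, and a GP prior maps latents to observations. This is not the paper's algorithm; a linear-Gaussian transition stands in for the hidden Markov prior, and all parameter names and values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-stacked input sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
T, Q, D = 100, 2, 5        # time steps, latent dimensions, observed dimensions

# First-order Markov prior over latents: x_t | x_{t-1} ~ N(a * x_{t-1}, q I)
a, q = 0.95, 0.1
X = np.zeros((T, Q))
for t in range(1, T):
    X[t] = a * X[t - 1] + np.sqrt(q) * rng.standard_normal(Q)

# GP prior on the latent-to-observation mapping; one function draw per output
K = rbf_kernel(X, X) + 1e-4 * np.eye(T)      # jitter for numerical stability
L = np.linalg.cholesky(K)
F = L @ rng.standard_normal((T, D))          # latent function values f(x_t)
Y = F + 0.1 * rng.standard_normal((T, D))    # noisy observations y_t
```

Inference in such models (recovering X and the GP mapping from Y) is the hard part the paper addresses; the sketch only shows the assumed generative direction.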

Cited by 11 publications (6 citation statements) · References 27 publications
“…Overall, the results show that the accuracy of the proposed approach is only marginally affected as the number of training data points decreases. This is reasonable and expected, as GPs are known to constitute an ideal modeling approach when dealing with sparse and/or limited training datasets [23,24].…”
Section: B. Examining Model Accuracy for Different-Sized Training Sets
confidence: 55%
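The claim above, that GP accuracy degrades only mildly as the training set shrinks, can be illustrated with a toy regression: with a handful of points, the posterior mean still tracks the target while the predictive variance honestly widens in the gaps. This is a self-contained sketch, not code from the cited work; the data and hyperparameters are made up for illustration.

```python
import numpy as np

def rbf(a, b, ls=0.7):
    """Squared-exponential kernel between 1-D input arrays a and b."""
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ls**2)

# Deliberately sparse training set: five noisy samples of sin(x)
rng = np.random.default_rng(0)
X = np.array([0.1, 1.0, 2.2, 3.5, 4.8])
y = np.sin(X) + 0.05 * rng.standard_normal(5)

Xs = np.linspace(0.0, 5.0, 200)                    # test inputs
K = rbf(X, X) + 0.05**2 * np.eye(5)                # Gram matrix + noise variance
Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)

mean = Ks.T @ np.linalg.solve(K, y)                    # posterior mean
var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))     # posterior variance

# The predictive variance stays small near the training points and widens in
# the gaps, so removing data degrades accuracy gracefully rather than abruptly.
```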
“…Moreover, our sampling strategy does not incur significant computational cost, since we adopt the reparameterization of Eq. (24) for most of the model training algorithm. In the future, we aim to examine how ACVI can cope with power-law distributions [33,34]; such a capability is important for real-world natural language generation.…”
Section: Discussion
confidence: 99%
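For context, the citing work's Eq. (24) is not reproduced here, but reparameterizations of this kind are typically the standard Gaussian form, in which a sample is written as a deterministic transform of parameter-free noise so that gradients flow through the sample cheaply. A minimal sketch under that assumption:

```python
import numpy as np

def sample_reparameterized(mu, log_sigma, rng):
    """Draw z ~ N(mu, diag(sigma^2)) as mu + sigma * eps with eps ~ N(0, I),
    so the sample is differentiable w.r.t. (mu, log_sigma)."""
    eps = rng.standard_normal(mu.shape)   # parameter-free noise
    return mu + np.exp(log_sigma) * eps

rng = np.random.default_rng(1)
mu = np.zeros(3)
log_sigma = np.log(0.5) * np.ones(3)
z = sample_reparameterized(mu, log_sigma, rng)   # one low-cost posterior sample
```

Because the randomness is isolated in eps, repeated sampling during training adds little cost beyond drawing standard normal noise, which is the point the statement makes.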
“…Given the training observations, Gaussian processes generate posterior probabilities of the target or output for new inputs or observations, as a function of the training data and the input kernel function [2]. Gaussian process models and their variants have been applied in a number of diverse fields, such as model predictive control and system analysis [3]-[7], latent variable models [8]-[11], multi-task learning [10], [12], [13], image analysis and synthesis [14]-[17], speech processing [18]-[20], and magnetic resonance imaging (MRI) [21], [22]. Gaussian processes have also been extended to non-stationary regression settings [23]-[25] and to regression over complex-valued data [26].…”
Section: Introduction
confidence: 99%
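For reference, the posterior prediction described in the statement above has a standard closed form. Assuming a zero-mean GP prior with kernel k, training inputs X with targets y, observation-noise variance σ², Gram matrix K = k(X, X), and k_* = k(X, x_*) for a test input x_*, the predictive mean and variance are:

$$
\mu(\mathbf{x}_*) = \mathbf{k}_*^{\top} \left(K + \sigma^2 I\right)^{-1} \mathbf{y},
\qquad
\sigma^2(\mathbf{x}_*) = k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^{\top} \left(K + \sigma^2 I\right)^{-1} \mathbf{k}_*
$$

Both quantities depend on the data only through the kernel evaluations and the targets, which is what the statement means by the posterior being a function of the training data and the input kernel function.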