2019
DOI: 10.48550/arxiv.1906.01549
Preprint

Streaming Variational Monte Carlo

Abstract: Nonlinear state-space models are powerful tools to describe dynamical structures in complex time series. In a streaming setting where data are processed one sample at a time, simultaneously inferring the state and its nonlinear dynamics has posed significant challenges in practice. We develop a novel online learning framework, leveraging variational inference and sequential Monte Carlo, which enables flexible and accurate Bayesian joint filtering. Our method provides a filtering posterior arbitrarily close t…
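The abstract names the two ingredients of the framework, variational inference and sequential Monte Carlo, without showing the mechanics. As a rough, hedged illustration only, and not the paper's exact algorithm, the sketch below shows one generic SMC filtering step in which the proposal q(x_t | x_{t-1}, y_t) would play the role of the learned variational component; the function names, the Gaussian-free interface, and the resampling rule are illustrative assumptions.

# Minimal sketch: one sequential Monte Carlo filtering step with a learned
# proposal. Illustrative only; not the algorithm of the paper above.
import numpy as np

def smc_filter_step(particles, log_weights, y_t, propose, log_f, log_g, rng):
    """One weighted-particle update approximating p(x_t | y_{1:t}).

    particles   : (N, d) particles approximating p(x_{t-1} | y_{1:t-1})
    log_weights : (N,) log importance weights
    propose     : callable q(x_t | x_{t-1}, y_t) -> (samples, log_q)  [assumed interface]
    log_f       : log transition density  log f(x_t | x_{t-1})
    log_g       : log observation density log g(y_t | x_t)
    """
    N = particles.shape[0]
    # Resample when the effective sample size drops below N/2 (a standard rule).
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    if 1.0 / np.sum(w ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        particles, log_weights = particles[idx], np.zeros(N)
    # Propagate through the proposal and reweight by f * g / q.
    new_particles, log_q = propose(particles, y_t, rng)
    log_weights = log_weights + log_f(new_particles, particles) \
                  + log_g(y_t, new_particles) - log_q
    return new_particles, log_weights

In a streaming setting the parameters of the proposal would additionally be updated online from the same weighted particles; that optimization step is omitted here.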

Cited by 3 publications (5 citation statements)
References 40 publications

“…The approach developed here can be thought of as a way to correct the approximate ELBOs computed in [16,44] in a principled manner, which takes into account the discrepancy between the filtering and approximate filtering distributions, and maintains the correct gradient dependencies in the computation graph. Finally, [43] relies on the PF to do online variational inference. However, the variational approximation of the filtering distribution is only implicit, as its expression includes an intractable expectation and, like any other PF technique, its performance is expected to degrade significantly with the state dimension [6].…”
Section: Related Work
confidence: 99%

“…We next evaluate the performance of our algorithm for state estimation in non-linear, high-dimensional SSMs. We reproduce the Chaotic Recurrent Neural Network (CRNN) example in [43], but with state dimension d_x = 5, 20, and 100. This non-linear model is an Euler approximation of the continuous-time recurrent neural network dynamics…”
Section: Chaotic Recurrent Neural Network
confidence: 99%

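The excerpt is cut off before the dynamics equation. For orientation only, the sketch below simulates the standard chaotic recurrent neural network state-space model that such experiments typically discretize with an Euler step, tau * dx/dt = -x + W tanh(x) plus Gaussian state noise; the constants (tau, step size, gain, noise level) and the coupling-matrix scaling are illustrative assumptions, not the values used in [43].

# Hedged sketch of an Euler-discretized chaotic RNN latent process.
# Parameter values are placeholders, not those of the cited experiment.
import numpy as np

def simulate_crnn(d_x=5, T=200, dt=0.1, tau=2.5, gamma=2.5, noise_std=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Random coupling with i.i.d. N(0, gamma^2 / d_x) entries; a gain gamma > 1
    # (folded into W here) puts the network in the chaotic regime.
    W = rng.normal(0.0, gamma / np.sqrt(d_x), size=(d_x, d_x))
    x = np.zeros((T, d_x))
    x[0] = rng.normal(size=d_x)
    for t in range(1, T):
        drift = (-x[t - 1] + W @ np.tanh(x[t - 1])) / tau
        x[t] = x[t - 1] + dt * drift + noise_std * np.sqrt(dt) * rng.normal(size=d_x)
    return x
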
“…In recent years, Variational Inference (VI) combined with Bayesian nonlinear filtering has been actively researched. In particular, VI based on Sequential Monte Carlo (SMC) methods [42,38,48,25,61,47,37,40] has achieved tighter bounds on the log marginal likelihood, both theoretically and experimentally. These methods, however, suffer from two drawbacks: particle degeneracy and biased gradient estimates.…”
Section: Introduction
confidence: 99%
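The "tighter bound" mentioned in the last excerpt is the SMC-based evidence lower bound used by that family of methods (variational SMC / FIVO-style objectives). As a hedged reminder in standard notation, not text taken from the excerpt:

\[
  \widehat{Z} \;=\; \prod_{t=1}^{T} \frac{1}{N} \sum_{i=1}^{N} w_t^{i},
  \qquad
  \mathcal{L}_{\mathrm{SMC}} \;=\; \mathbb{E}\big[\log \widehat{Z}\big]
  \;\le\; \log \mathbb{E}\big[\widehat{Z}\big] \;=\; \log p(y_{1:T}),
\]

where \(w_t^{i}\) are the unnormalized importance weights of an N-particle filter and the equality on the right follows from the unbiasedness of the particle-filter likelihood estimate. Increasing N tightens the bound, but the resampling steps inside \(\widehat{Z}\) are usually not differentiated through, which yields the biased gradient estimates the excerpt points to.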