Bayesian Time Series Models 2011
DOI: 10.1017/cbo9780511984679.013

Sequential inference for dynamically evolving groups of objects

Cited by 4 publications (5 citation statements)
References 12 publications
“…A standard (variable rate) particle filter can be used for inference but is not efficient as the proposal in (4) involves multiple latent variables in a single propagation. We address this high-dimensional proposal with the SMCMC algorithm, where samples are obtained with local or global MCMC moves followed by a Metropolis-Hastings (MH) accept-reject step as bias correction [13], [14]. Furthermore, a mixture sampling procedure is adopted: at each MCMC iteration, a decision is made on performing either a joint MH proposal step with probability P_J or a sequence of individual refinement Metropolis-within-Gibbs transitions with probability 1 − P_J.…”
Section: Sequential Inference
confidence: 99%
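The mixture sampling procedure quoted above can be sketched as a single MCMC kernel. This is a minimal illustration, not the authors' implementation: the isotropic Gaussian `log_target`, the proposal `scale`, and the name `p_joint` (standing in for P_J) are all assumptions; the real sampler targets a joint posterior over multiple latent variables per time step.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy stand-in for the joint posterior: an isotropic standard Gaussian.
    return -0.5 * np.sum(x**2)

def smcmc_mixture_step(x, p_joint=0.5, scale=0.5):
    """One iteration of the mixture kernel: with probability p_joint make a
    joint MH proposal over all latent variables at once; otherwise sweep the
    coordinates with individual Metropolis-within-Gibbs refinement moves."""
    if rng.random() < p_joint:
        # Joint random-walk proposal over the whole latent vector.
        prop = x + scale * rng.standard_normal(x.shape)
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop  # MH accept-reject as bias correction
    else:
        # Individual refinement: update one coordinate at a time.
        for i in range(len(x)):
            prop = x.copy()
            prop[i] += scale * rng.standard_normal()
            if np.log(rng.random()) < log_target(prop) - log_target(x):
                x = prop
    return x

# Run the chain and collect samples from the toy target.
x = np.zeros(3)
samples = []
for _ in range(5000):
    x = smcmc_mixture_step(x)
    samples.append(x.copy())
samples = np.array(samples)
```

The joint move can make large correlated jumps but suffers in high dimensions; the coordinate-wise refinements keep acceptance rates healthy, which is why the quoted scheme mixes the two.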
“…Instead, we address this high-dimensional proposal with an SMCMC algorithm which sequentially targets the joint distributions of Eq. (17), using both local and global Metropolis-Hastings (MH) accept-reject moves instead of importance sampling or resampling [20], [14], [21], [22].…”
Section: Sequential Inference
confidence: 99%
“…Reversible jump MCMC methods must be used to handle the varying number of notes. See [7] for details of similar schemes for tracking and finance applications, [10] for general information about MCMC, and [11] for details of the reversible jump MCMC framework. Our implementation proceeds as shown in Algorithm 1.…”
Section: Sequential MCMC
confidence: 99%
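The reversible jump idea invoked above, moving between spaces of different dimension as notes appear and disappear, can be illustrated with a toy trans-dimensional sampler. Everything here is an assumption for illustration (the truncated Poisson prior, `K_MAX`, and the prior-as-proposal birth move), not the citing paper's multipitch model; with the new component drawn from its own N(0,1) prior, the proposal density cancels the prior and the Jacobian is 1, so the acceptance ratio reduces to a ratio of priors on the model order.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)

K_MAX = 5
LAM = 2.0
# Truncated Poisson prior over the number of components ("notes").
w = np.array([LAM**k / factorial(k) for k in range(1, K_MAX + 1)])
p_k = w / w.sum()

def log_p_k(k):
    return np.log(p_k[k - 1])

theta = np.array([0.0])  # start with a single component
counts = np.zeros(K_MAX)
for _ in range(20000):
    k = len(theta)
    if rng.random() < 0.5:  # birth move: add a component
        if k < K_MAX:
            u = rng.standard_normal()  # draw the new component from its prior
            # Prior-as-proposal: N(0,1) densities cancel, Jacobian is 1,
            # so the acceptance ratio is just p(k+1)/p(k).
            if np.log(rng.random()) < log_p_k(k + 1) - log_p_k(k):
                theta = np.append(theta, u)
    else:  # death move: drop the last component (reverse of birth)
        if k > 1:
            if np.log(rng.random()) < log_p_k(k - 1) - log_p_k(k):
                theta = theta[:-1]
    counts[len(theta) - 1] += 1

freq = counts / counts.sum()  # empirical distribution over model order
```

A real sampler of this kind interleaves such birth/death jumps with fixed-dimension updates of the component parameters; here the within-model state is left at its prior draw to keep the dimension-matching step in focus.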
“…A dynamical model for these parameters over time frames then completes a Bayesian spatio-temporal state space model. Inference in this model is carried out using a specially modified version of the sequential MCMC algorithm [7], in which information about the previous frame is collapsed onto a single univariate marginal representation of the multipitch estimation.…”
Section: Introduction
confidence: 99%