2015
DOI: 10.1007/s11222-015-9612-3

Sequential Monte Carlo methods for mixtures with normalized random measures with independent increments priors

Abstract: Normalized random measures with independent increments are a general, tractable class of nonparametric prior. This paper describes sequential Monte Carlo methods for both conjugate and non-conjugate nonparametric mixture models with these priors. A simulation study is used to compare the efficiency of the different algorithms for density estimation and comparisons made with Markov chain Monte Carlo methods. The SMC methods are further illustrated by applications to dynamically fitting a nonparametric stochasti…

Cited by 9 publications (7 citation statements). References 50 publications (67 reference statements).
“…After the reweighting step, some particles will have negligible weights and so a resampling step replaces the weighted sample of particles with an unweighted set by removing particles with low weights and replacing them with those with relatively high weights. Various resampling techniques exist (see Hol et al. 2006), of which we implement systematic resampling as it has been found to offer significantly improved mixing over other methods and decrease the level of path degeneracy (Chopin and Singh 2015; Griffin 2014). This allows for exploitation of the information gained by these higher-weight particles by concentrating computational resources on them.…”
Section: particleMDI
Mentioning confidence: 99%
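The systematic resampling scheme mentioned in the statement above can be sketched as follows. This is a minimal, generic implementation, not code from any of the cited papers; the function name `systematic_resample` and the NumPy-based interface are assumptions made for illustration:

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: a single uniform draw places n evenly spaced
    points on [0, 1); each point selects the particle whose cumulative
    weight interval it falls into. This uses only one random number per
    resampling step, which is part of why it reduces path degeneracy
    relative to multinomial resampling."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    # evenly spaced positions with one shared random offset
    positions = (rng.random() + np.arange(n)) / n
    cum = np.cumsum(weights)
    cum[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cum, positions)
```

After this step, the returned indices define an equally weighted particle set; particles with large weights appear multiple times, and those with negligible weights are dropped.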
“…The task of cluster analysis, however, is not typically one viewed as evolving over time and so would not appear suitable for sequential methods. Nevertheless, particle filter methods have been successfully applied to the task (see, e.g., Chopin 2002; Fearnhead 2004; Griffin 2014; Bouchard-Côté et al. 2017). The approach involves treating the observation index, 1:n, as an artificial time index.…”
Section: particleMDI
Mentioning confidence: 99%
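The "artificial time index" idea can be sketched as a sequential importance sampler over cluster allocations: each particle carries a partial allocation vector and is extended one observation at a time. The toy example below uses a Chinese restaurant process prior with a conjugate Gaussian likelihood; it is a generic illustration of the technique, not the algorithm of any cited paper, and the function name `smc_allocations` and all hyperparameter values are assumptions:

```python
import numpy as np

def smc_allocations(x, alpha=1.0, n_particles=50, sigma2=1.0, tau2=4.0, rng=None):
    """Sequential importance sampling for cluster allocations under a CRP
    prior with a conjugate Normal likelihood (known variance sigma2, prior
    mean 0 with variance tau2). The observation index i = 1..n plays the
    role of an artificial time index: each particle extends its allocation
    vector c_{1:i-1} by one entry per observation."""
    rng = np.random.default_rng() if rng is None else rng
    alloc = [[] for _ in range(n_particles)]   # allocations c_{1:i} per particle
    stats = [[] for _ in range(n_particles)]   # per-cluster [count, sum of x]
    logw = np.zeros(n_particles)
    for xi in x:
        for p in range(n_particles):
            k = len(stats[p])
            # CRP prior over existing clusters plus one new cluster
            prior = np.array([c for c, _ in stats[p]] + [alpha], dtype=float)
            prior /= prior.sum()
            # posterior-predictive density of xi under each choice
            dens = np.empty(k + 1)
            for j in range(k + 1):
                if j < k:
                    c, s = stats[p][j]
                    post_var = 1.0 / (1.0 / tau2 + c / sigma2)
                    post_mean = post_var * s / sigma2
                else:
                    post_var, post_mean = tau2, 0.0  # empty (new) cluster
                v = sigma2 + post_var
                dens[j] = np.exp(-0.5 * (xi - post_mean) ** 2 / v) / np.sqrt(2 * np.pi * v)
            probs = prior * dens
            logw[p] += np.log(probs.sum())               # incremental weight
            j = int(rng.choice(k + 1, p=probs / probs.sum()))
            alloc[p].append(j)
            if j == k:
                stats[p].append([1, xi])
            else:
                stats[p][j][0] += 1
                stats[p][j][1] += xi
    w = np.exp(logw - logw.max())
    return alloc, w / w.sum()
```

In the conjugate case the proposal above is the "optimal" one (sampling from the exact conditional), so the incremental weight depends only on the marginal predictive; a full algorithm would interleave the resampling step described earlier.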
“…particleMDI extends the original MDI algorithm, replacing the one-at-a-time approach to clustering with a conditional particle filter, which has demonstrated good mixing properties even when the number of particles is relatively low [8]. This approach to cluster analysis [see 4, 5, 8] infers a latent cluster allocation, c_{i,k}, for an observation, x_{i,k}, given observations x_{1:i,k} and allocations c_{1:(i-1),k}, using a weighted cloud of approximations, termed particles.…”
Section: particleMDI
Mentioning confidence: 99%
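The conditional particle filter referenced above can be illustrated on a generic toy state-space model. This is not particleMDI's clustering model: the Gaussian random-walk setup, the function name `conditional_pf`, and the parameter values are assumptions made purely to show the conditioning mechanism, in which one reference trajectory is pinned and always survives resampling:

```python
import numpy as np

def conditional_pf(y, x_ref, n_particles=20, q=1.0, r=1.0, rng=None):
    """Conditional particle filter for the toy model
    x_t ~ N(x_{t-1}, q), y_t ~ N(x_t, r). The reference path x_ref is kept
    as particle 0 and is never discarded by resampling, which is what gives
    particle Gibbs good mixing even with relatively few particles."""
    rng = np.random.default_rng() if rng is None else rng
    T = len(y)
    X = np.zeros((n_particles, T))
    ancestors = np.zeros(n_particles, dtype=int)
    for t in range(T):
        if t == 0:
            X[:, 0] = rng.normal(0.0, np.sqrt(q), n_particles)
        else:
            # resample ancestors of the free particles; particle 0 is pinned
            ancestors[1:] = rng.choice(n_particles, size=n_particles - 1, p=w)
            ancestors[0] = 0
            X = X[ancestors]
            X[:, t] = rng.normal(X[:, t - 1], np.sqrt(q))
        X[0, t] = x_ref[t]                     # pin the reference trajectory
        logw = -0.5 * (y[t] - X[:, t]) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
    # draw one surviving trajectory as the new reference for the next sweep
    return X[rng.choice(n_particles, p=w)]
```

Iterating this kernel (each sweep conditioning on the trajectory returned by the previous one) yields a particle Gibbs sampler targeting the smoothing distribution.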
“…As well as giving the algorithm a ‘warm start’, this also avoids introducing a dependency between the inferred allocations and the order in which data are observed. Other approaches, such as that in [8], resolve this issue by instead updating all previous allocations during the resampling step. In a worst-case scenario, where resampling is performed at every step, this would increase the complexity of the algorithm from O(n) to O(n^2), assuming the mutation weights can be computed in constant time.…”
Section: Inputs
Mentioning confidence: 99%