2016
DOI: 10.1109/tsp.2015.2504342
Efficient Sequential Monte-Carlo Samplers for Bayesian Inference

Abstract: In many problems, complex non-Gaussian and/or nonlinear models are required to accurately describe a physical system of interest. In such cases, Monte Carlo algorithms are remarkably flexible and extremely powerful approaches for solving such inference problems. However, in the presence of a high-dimensional and/or multimodal posterior distribution, it is widely documented that standard Monte Carlo techniques can lead to poor performance. In this paper, the study is focused on a Sequential…

Cited by 41 publications (54 citation statements)
References 34 publications
“…Before we introduce the SMC sampler algorithm for gene expression decomposition, we will succinctly describe the general principle of SMC samplers in Bayesian inference settings [28–30]. Denote the prior distribution, the likelihood function and the posterior distribution in a Bayesian inference setup as p(θ), p(Y|θ) and p(θ|Y), respectively.…”
Section: Methods
confidence: 99%
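The prior–likelihood–posterior notation quoted above can be made concrete with a minimal numerical sketch. The Gaussian model below is purely illustrative (not from the paper): a standard-normal prior on a scalar θ and a Gaussian likelihood for one observation Y, with the posterior evaluated on a grid via Bayes' rule.

```python
import numpy as np

# Hypothetical setup (illustrative only): scalar theta with prior N(0, 1)
# and likelihood N(theta, 1) for a single observation Y.
Y = 1.2
theta = np.linspace(-5.0, 5.0, 2001)          # grid over the parameter
prior = np.exp(-0.5 * theta**2)               # p(theta), unnormalized
likelihood = np.exp(-0.5 * (Y - theta)**2)    # p(Y | theta)

unnorm = prior * likelihood                   # p(theta) p(Y | theta)
Z = np.trapz(unnorm, theta)                   # evidence, by quadrature
posterior = unnorm / Z                        # p(theta | Y)

# For this conjugate Gaussian case the posterior is N(Y/2, 1/2),
# so the posterior mean recovered from the grid should be Y/2.
post_mean = np.trapz(theta * posterior, theta)
```

In higher dimensions this grid evaluation becomes infeasible, which is exactly why sampling methods such as SMC samplers are used instead.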
“…Using Bayes' rule, the posterior distribution can be written as a function of the prior distribution and the likelihood function as follows: p(θ|Y) = p(θ)p(Y|θ)/Z, where Z = ∫_Θ p(θ)p(Y|θ) dθ, a constant with respect to θ, is referred to as the evidence. With SMC samplers, rather than sampling from the posterior distribution p(θ|Y) in (5) directly, a sequence of intermediate target distributions, {π_t}_{t=1}^{T}, is designed that transitions smoothly from the prior distribution, i.e., π_1 = p(θ), which is usually easier to sample from, and gradually introduces the effect of the likelihood so that in the end we have π_T = p(θ|Y), which is the posterior distribution of interest [28, 29]. For such a sequence of intermediate distributions, a natural choice is the likelihood-tempered target sequence [28, 41]: π_t(θ) = γ_t(θ)/Z_t with γ_t(θ) = p(θ)p(Y|θ)^{ϵ_t}, where {ϵ_t}_{t=1}^{T} is a non-decreasing temperature schedule with ϵ_1 = 0 and ϵ_T = 1, γ_t(θ) is the unnormalized target distribution and Z_t is the evidence at time t.…”
Section: Methods
confidence: 99%
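The likelihood-tempered sequence described in the quote above can be sketched as a small SMC sampler. This is a minimal illustration under assumed choices (a toy Gaussian model, a linear temperature schedule, multinomial resampling triggered by the effective sample size, and one random-walk Metropolis move per step), not the specific algorithm of the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative only): prior N(0, 1), likelihood N(theta, 1)
# for one observation Y. The exact posterior is N(Y/2, 1/2).
Y = 1.2
def log_prior(th): return -0.5 * th**2
def log_like(th):  return -0.5 * (Y - th)**2

N, T = 5000, 20
eps = np.linspace(0.0, 1.0, T)          # temperature schedule, eps_1 = 0, eps_T = 1
theta = rng.standard_normal(N)          # particles drawn from the prior (pi_1)
logw = np.zeros(N)

for t in range(1, T):
    # Incremental importance weight: p(Y|theta)^(eps_t - eps_{t-1})
    logw += (eps[t] - eps[t - 1]) * log_like(theta)

    # Multinomial resampling when the effective sample size degrades
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w**2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        theta, logw = theta[idx], np.zeros(N)

    # One random-walk Metropolis move targeting pi_t ∝ p(theta) p(Y|theta)^eps_t
    prop = theta + 0.5 * rng.standard_normal(N)
    log_acc = (log_prior(prop) + eps[t] * log_like(prop)
               - log_prior(theta) - eps[t] * log_like(theta))
    accept = np.log(rng.uniform(size=N)) < log_acc
    theta = np.where(accept, prop, theta)

# Weighted posterior mean; should be close to Y/2 = 0.6 for this model.
w = np.exp(logw - logw.max()); w /= w.sum()
post_mean = np.sum(w * theta)
```

The move step keeps each π_t invariant while the reweighting bridges consecutive temperatures, which is what lets the particle cloud migrate smoothly from prior to posterior.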
“…However, if more VAFs are available for newly called SNV(s), both algorithms have to be restarted in order to incorporate the newly called SNV(s). Moreover, the MCMC approach in general, as previously shown in (Nguyen et al., 2016; Jasra et al., 2007), is plagued by some inherent issues which often limit its performance: (i) it is sometimes difficult to assess when the Markov chain has reached its stationary regime of interest; (ii) it requires a burn-in period and a thinning interval; and, most importantly, (iii) if the target distribution is highly multimodal, MCMC algorithms can easily become trapped in local modes.…”
Section: Introduction
confidence: 99%