2018 IEEE Statistical Signal Processing Workshop (SSP)
DOI: 10.1109/ssp.2018.8450772
Sequential MCMC With The Discrete Bouncy Particle Sampler

Cited by 4 publications (11 citation statements)
References 18 publications
“…At the same time, this does not affect the better performance of the DZZ method. As in [31], we employ Γ to obtain 𝑀; Γ follows equation (20)…”
Section: Proposal Methods
confidence: 99%
“…However, block MH-within-Gibbs sampling is employed as the refinement step in the traditional Composite MH Kernel, and it results in an extremely low sampling rate. To improve efficiency, HMC and the DBPS [31] have been proposed as the refinement step used to construct q_{t,3}(·) in the Composite MH Kernel, and these have become one of the most effective classes of methods.…”
Section: Problem Statement
confidence: 99%
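For context, a minimal sketch of the composite-kernel structure described in this excerpt is given below: a joint MH move followed by a coordinate-wise MH-within-Gibbs refinement sweep. This illustrates the traditional refinement step the excerpt criticizes, not the HMC or DBPS refinement of the cited works; the target density, proposal scales, and all function names are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Placeholder log-density (standard Gaussian). In the filtering setting
    # this would be the log of the unnormalized posterior at time t.
    return -0.5 * np.sum(x ** 2)

def composite_mh_step(x, joint_scale=0.5, refine_scale=0.2):
    # Stage 1: joint MH move over all coordinates at once.
    y = x + joint_scale * rng.standard_normal(x.shape)
    if np.log(rng.uniform()) < log_target(y) - log_target(x):
        x = y
    # Stage 2: refinement sweep, one coordinate (block) at a time
    # (MH-within-Gibbs); this is the step the cited works replace with
    # HMC or DBPS moves.
    for i in range(x.size):
        z = x.copy()
        z[i] += refine_scale * rng.standard_normal()
        if np.log(rng.uniform()) < log_target(z) - log_target(x):
            x = z
    return x

# Toy run of the chain.
x = np.zeros(3)
for _ in range(1000):
    x = composite_mh_step(x)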
“…The literature on MCMC algorithms provides numerous ways in which to construct the Markov kernels K_{μ_n}. For instance, we could use Metropolis-Hastings (MH) [6], the Metropolis-adjusted Langevin algorithm, Hamiltonian Monte Carlo and hybrid kernels [31], [32], or kernels based on invertible particle flow ideas [22] or on the bouncy particle sampler [28]. As an illustration, Example 1 describes a simple independent MH kernel with a proposal distribution tailored to our setting.…”
Section: Generic MCMC-PF Algorithm
confidence: 99%
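As a rough illustration of such a kernel, the sketch below implements an independent MH kernel within one sequential MCMC step: the proposal draws an ancestor uniformly from the previous particle set and propagates it through the transition density, so that with equal weights the acceptance ratio reduces to a likelihood ratio. The AR(1) dynamics, Gaussian observation model, and all function names are assumptions for illustration, not the paper's construction.

import numpy as np

rng = np.random.default_rng(1)

def transition_sample(x_prev):
    # Illustrative AR(1) latent dynamics f(x_n | x_{n-1}).
    return 0.9 * x_prev + rng.standard_normal()

def log_likelihood(y, x):
    # Illustrative Gaussian observation model g(y_n | x_n) with unit variance.
    return -0.5 * (y - x) ** 2

def independent_mh_kernel(x_curr, particles_prev, y, n_iters=50):
    # One MCMC chain at time n. The independent proposal picks an ancestor
    # uniformly from the previous particle set and pushes it through the
    # transition density; the predictive term then cancels in the ratio.
    samples = []
    for _ in range(n_iters):
        j = rng.integers(len(particles_prev))          # propose an ancestor
        x_prop = transition_sample(particles_prev[j])  # and a new state
        if np.log(rng.uniform()) < log_likelihood(y, x_prop) - log_likelihood(y, x_curr):
            x_curr = x_prop
        samples.append(x_curr)
    return np.array(samples)

# Toy usage: previous-step particles, one observation, run one chain.
particles_prev = rng.standard_normal(100)
new_particles = independent_mh_kernel(particles_prev[0], particles_prev, y=1.5)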
“…An alternative to these schemes, in which the particles at each time step are instead sampled according to a single Markov chain Monte Carlo (MCMC) chain, was proposed early on by [6]. Over recent years, there has been renewed interest in such ideas, as there is empirical evidence that these methods can outperform standard SMC algorithms in interesting scenarios (see, e.g., [8], [15], [28], [31], and [32] for novel applications and extensions). These methods have been termed sequential MCMC methods in the literature.…”
Section: Introduction
confidence: 99%