2019
DOI: 10.1080/10618600.2019.1598872
Adaptive Incremental Mixture Markov Chain Monte Carlo

Abstract: We propose Adaptive Incremental Mixture Markov chain Monte Carlo (AIMM), a novel approach to sample from challenging probability distributions defined on a general state-space. While adaptive MCMC methods usually update a parametric proposal kernel with a global rule, AIMM locally adapts a semiparametric kernel. AIMM is based on an independent Metropolis-Hastings proposal distribution which takes the form of a finite mixture of Gaussian distributions. Central to this approach is the idea that the proposal distribution adapts to the target by locally adding a mixture component when the discrepancy between the proposal mixture and the target is deemed to be too large.
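The mechanism described in the abstract can be illustrated with a minimal sketch of an independent Metropolis-Hastings sampler whose proposal is a fixed finite Gaussian mixture. The adaptation step is omitted here, and the target, mixture weights, means, and scales are illustrative assumptions, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D target: unnormalized two-mode Gaussian mixture.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

# Finite Gaussian mixture proposal (weights, means, standard deviations).
weights = np.array([0.5, 0.5])
means = np.array([-2.0, 2.0])
stds = np.array([1.5, 1.5])

def log_proposal(x):
    # Log-density of the mixture, evaluated componentwise then combined.
    comp = -0.5 * ((x - means) / stds) ** 2 - np.log(stds * np.sqrt(2.0 * np.pi))
    return np.logaddexp.reduce(np.log(weights) + comp)

def sample_proposal():
    k = rng.choice(len(weights), p=weights)
    return rng.normal(means[k], stds[k])

def independent_mh(n_steps, x0=0.0):
    x, chain = x0, []
    for _ in range(n_steps):
        y = sample_proposal()
        # Independent MH ratio: the proposal ignores the current state,
        # so its density appears on both sides of the acceptance ratio.
        log_alpha = (log_target(y) - log_target(x)
                     + log_proposal(x) - log_proposal(y))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        chain.append(x)
    return np.array(chain)

chain = independent_mh(5000)
```

The key property AIMM exploits is that an independent proposal works well exactly when the mixture tracks the target closely, which motivates adding components where it does not.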

Cited by 6 publications (7 citation statements) · References 42 publications
“…However, if parameters are interdependent, the MCMC algorithm samples manifolds of alternative solutions, resulting in large standard errors of the overfitted parameters. We successfully addressed this issue with a modified version of Maire’s algorithm [56]. Central to this approach is the idea that the proposal distribution adapts to the target by locally adding a mixture component when the discrepancy between the proposal mixture and the target is deemed to be too large.…”
Section: Discussion
confidence: 99%
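The adaptation rule quoted above, adding a local component when the proposal-versus-target discrepancy is too large, can be sketched schematically. The threshold, new-component scale, and weight below are hypothetical choices, not those of [56]:

```python
import numpy as np

def maybe_add_component(x, log_target_x, log_proposal_x,
                        means, stds, weights,
                        threshold=2.0, new_std=0.5, new_weight=0.1):
    """If the log discrepancy between target and proposal at x exceeds
    a threshold, add a local Gaussian component centred at x and
    renormalize the mixture weights."""
    if log_target_x - log_proposal_x > threshold:
        means = np.append(means, x)
        stds = np.append(stds, new_std)
        # Shrink existing weights to make room for the new component.
        weights = np.append(weights * (1.0 - new_weight), new_weight)
        weights /= weights.sum()
    return means, stds, weights
```

Called after each proposal evaluation, this incrementally fills in regions where the mixture underestimates the target.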
“…Here the proposal distribution is learned along the way, using the full information accumulated so far. We implemented the adaptive MCMC version with a Gaussian proposal distribution described in [60], as well as Adaptive Incremental Mixture MCMC [56], called AIMM, which we modified slightly. Strictly speaking, these methods are not truly Markov chains, because the proposal distribution at the next step depends on all preceding states rather than only the previous one.…”
Section: Figure A1
confidence: 99%
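The dependence on the full history described in the statement above can be seen in a minimal adaptive random-walk Metropolis sketch in the spirit of [60], where the proposal variance is the running empirical variance of all past states. The regularization floor and scaling constant here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def adaptive_metropolis(log_target, x0, n_steps, floor=0.05, sd=2.4):
    """1-D random-walk Metropolis whose proposal variance is the running
    empirical variance of the entire chain so far (Haario-style
    adaptation); `floor` keeps early proposals from degenerating."""
    x = x0
    chain = [x]
    for _ in range(1, n_steps):
        # The proposal scale uses the full history, not just the last state,
        # which is exactly why the resulting process is not Markovian.
        var = np.var(chain) + floor
        y = x + rng.normal(0.0, sd * np.sqrt(var))
        if np.log(rng.uniform()) < log_target(y) - log_target(x):
            x = y
        chain.append(x)
    return np.array(chain)

# Illustrative run on a standard normal target.
samples = adaptive_metropolis(lambda x: -0.5 * x ** 2, 0.0, 4000)
```

Ergodicity of such chains requires diminishing-adaptation conditions, which is the theoretical point the quoted passage is alluding to.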
“…Nevertheless, we have been able to analyze certain behaviors of proposals based on kernel density estimators. A version of this procedure (to use kernel density estimators as a proposal in independent Metropolis-Hastings) was previously pursued in Maire et al (2019); however, they do not appear to have looked at the question of when the addition of a new component into the mixture is beneficial, which is the topic under consideration in the sequel.…”
Section: H Adaptation of a Kernel Density Proposal Distribution
confidence: 99%
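A kernel-density-estimator proposal of the kind discussed above can be sketched as follows. The bandwidth and stored points are illustrative, and the snippet shows only the proposal density and its sampler, not a full adaptation scheme:

```python
import numpy as np

rng = np.random.default_rng(3)

def kde_log_pdf(x, points, h):
    """Log-density of a Gaussian KDE built from stored points; usable as
    the proposal density in independent Metropolis-Hastings."""
    z = (x - points) / h
    return (np.logaddexp.reduce(-0.5 * z ** 2)
            - np.log(len(points) * h * np.sqrt(2.0 * np.pi)))

def kde_sample(points, h):
    # Sampling from a KDE: pick a stored point uniformly at random,
    # then jitter it with N(0, h^2) noise.
    return rng.choice(points) + h * rng.normal()
```

Appending each accepted state to `points` turns this into the adaptive scheme the quoted passage analyzes; the open question it raises is when enlarging the mixture actually helps.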
“…In view of the shortcomings of previous works, this paper proposes a downhole track line detection model based on CGAN. We use the method of adversarial learning to solve the problem of artificially designing complex loss functions and introduce Monte Carlo search [14] technology into the generator network. Monte Carlo searches have been widely used in text generation tasks.…”
Section: Introduction
confidence: 99%