2000
DOI: 10.1016/s0304-4149(99)00082-4
Geometric ergodicity of Metropolis algorithms

Cited by 161 publications (182 citation statements)
References 17 publications
“…These techniques are random-variable generators that allow us to draw samples from complicated distributions. Perhaps the most commonly used MCMC algorithm is Metropolis-Hastings (MH), which operates as follows [14]: from a given proposal distribution, we construct an irreducible Markov chain whose stationary distribution is the sought posterior law (i.e., samples generated by the algorithm after a suitable burn-in period are distributed according to the desired posterior law). At each iteration t, a decision rule is applied to accept or reject the proposed sample, based on the following acceptance probability:…”
Section: Introduction (mentioning)
confidence: 99%
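The accept/reject rule described in this snippet can be illustrated with a minimal random-walk Metropolis sketch (an illustrative reconstruction, not code from the cited work; the target and proposal choices below are assumptions). With a symmetric proposal, the Hastings correction cancels and the acceptance probability reduces to min(1, π(x′)/π(x)):

```python
import math
import random

def metropolis(log_target, step_sd, x0, n_samples, burn_in=1000):
    """Random-walk Metropolis: propose x' = x + N(0, step_sd^2), then
    accept with probability min(1, pi(x')/pi(x)), computed in log space."""
    x = x0
    samples = []
    for t in range(burn_in + n_samples):
        x_prop = x + random.gauss(0.0, step_sd)
        # Acceptance probability of the proposed sample at iteration t.
        alpha = math.exp(min(0.0, log_target(x_prop) - log_target(x)))
        if random.random() < alpha:
            x = x_prop  # accept; otherwise the chain stays at x
        if t >= burn_in:
            samples.append(x)
    return samples

random.seed(0)
# Illustrative target: standard normal, up to an additive log constant.
samples = metropolis(lambda x: -0.5 * x * x, 1.0, 0.0, 20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

After the burn-in period the draws behave approximately like samples from the target, so the empirical mean and variance land near 0 and 1 here.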
“…The following lemma forms the basis for proving the validity of extended Gibbs sampling as an MCMC sampler. (For a discussion of the ergodicity of MCMC samplers, see Roberts and Rosenthal 1999; Jarner and Hansen 2000.)…”
Section: Extended Gibbs Sampling (mentioning)
confidence: 99%
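As a small illustration of the Gibbs idea referenced here (a generic two-variable sketch, not the extended sampler the snippet discusses), a Gibbs sampler alternates exact draws from the full conditionals; for a bivariate normal with unit variances and correlation rho, both conditionals are themselves normal:

```python
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=1000):
    """Gibbs sampler for a bivariate normal with unit marginal variances
    and correlation rho, alternating the two full conditionals:
      x | y ~ N(rho*y, 1 - rho^2),   y | x ~ N(rho*x, 1 - rho^2)."""
    sd = (1.0 - rho * rho) ** 0.5
    x = y = 0.0
    samples = []
    for t in range(burn_in + n_samples):
        x = random.gauss(rho * y, sd)  # draw x from its full conditional
        y = random.gauss(rho * x, sd)  # draw y from its full conditional
        if t >= burn_in:
            samples.append((x, y))
    return samples

random.seed(1)
samples = gibbs_bivariate_normal(0.8, 20000)

# Empirical correlation of the post-burn-in draws.
n = len(samples)
mx = sum(p[0] for p in samples) / n
my = sum(p[1] for p in samples) / n
cov = sum((p[0] - mx) * (p[1] - my) for p in samples) / n
vx = sum((p[0] - mx) ** 2 for p in samples) / n
vy = sum((p[1] - my) ** 2 for p in samples) / n
corr = cov / (vx * vy) ** 0.5
```

Because every conditional draw is accepted by construction, no Metropolis-style rejection step is needed; the empirical correlation recovers rho.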
“…Ergodicity results also exist for a Markov chain constructed using the RWM algorithm [21][22][23]. For geometric ergodicity, π(x) must have at least exponentially light tails, meaning that π(x)/e^{−x} → c as x → ∞, for some constant c. For super-exponential tails (where π(x) → 0 faster than at an exponential rate), additional conditions are required [21,23].…”
Section: Random Walk Proposals (mentioning)
confidence: 99%
“…For geometric ergodicity, π(x) must have at least exponentially light tails, meaning that π(x)/e^{−x} → c as x → ∞, for some constant c. For super-exponential tails (where π(x) → 0 faster than at an exponential rate), additional conditions are required [21,23]. We demonstrate with a simple example why heavy-tailed forms of π(x) (where π(x) → 0 at a rate slower than e^{−x}) pose difficulties here.…”
Section: Random Walk Proposals (mentioning)
confidence: 99%
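The tail condition in these snippets can be probed numerically (a hedged sketch with illustrative densities, not an example from the cited papers). For a fixed outward step h, the acceptance ratio π(x+h)/π(x) of a random-walk proposal tends to 0 far in the tail of a super-exponential target, stays constant for an exponential target, and tends to 1 for a heavy-tailed target, so a heavy-tailed chain feels essentially no pull back toward the mode:

```python
import math

def log_ratio(log_pi, x, h):
    """log pi(x + h) - log pi(x): log acceptance ratio of an outward
    random-walk step of size h from tail position x."""
    return log_pi(x + h) - log_pi(x)

log_normal = lambda x: -0.5 * x * x              # super-exponential tails
log_laplace = lambda x: -abs(x)                  # exponential tails
log_cauchy = lambda x: -math.log(1.0 + x * x)    # heavy (polynomial) tails

# Far in the right tail, compare pi(x+1)/pi(x) for each target.
r_normal_far = math.exp(log_ratio(log_normal, 10.0, 1.0))    # ~ e^{-10.5}
r_laplace_far = math.exp(log_ratio(log_laplace, 100.0, 1.0)) # = e^{-1}
r_cauchy_far = math.exp(log_ratio(log_cauchy, 1000.0, 1.0))  # ~ 1
```

Outward moves are almost never accepted under the normal target, accepted at a fixed rate e^{−1} under the Laplace target, and accepted almost surely under the Cauchy target, which is the random-walk behavior (no geometric drift toward the center) that makes heavy-tailed π(x) difficult.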