2022
DOI: 10.1007/s11222-022-10137-8

Adaptive random neighbourhood informed Markov chain Monte Carlo for high-dimensional Bayesian variable selection

Abstract: We introduce a framework for efficient Markov chain Monte Carlo algorithms targeting discrete-valued high-dimensional distributions, such as posterior distributions in Bayesian variable selection problems. We show that many recently introduced algorithms, such as the locally informed sampler of Zanella (J Am Stat Assoc 115(530):852–865, 2020), the locally informed with thresholded proposal of Zhou et al. (Dimension-free mixing for high-dimensional Bayesian variable selection, 2021) and the adaptively scaled in…

Cited by 5 publications (13 citation statements) · References 60 publications
“…Some recently developed algorithms for high-dimensional variable selection use an independent approximation to the posterior distribution. In MCMC methods, the ASI algorithm, and its developments ARNI and PARNI (Liang et al., 2022), use an independent approximation. Similarly, Ray and Szabó (2022) study variational inference for Bayesian variable selection where the variational distribution is independent across the variables.…”
Section: Discussion
confidence: 99%
“…, γ^(N) from the posterior distribution p(γ | Data). Unlike the first approach, this leads to unbiased estimates of posterior quantities, but it can be computationally expensive to generate a representative sample (Zanella and Roberts, 2019; Zhou et al., 2022; Liang et al., 2022).…”
Section: Bayesian Variable Selection
confidence: 99%
“…ASI is able to adapt to the importance of each candidate covariate and propose multiple swaps using MCMC methods with the construction of adaptive random neighbourhood informed proposals. In a more recent paper, Liang et al. (2022) thoroughly analysed the ASI algorithm, showing that ASI is a random neighbourhood sampler whose second stage is a random walk proposal. Building on this, Liang et al. (2022) introduced a new algorithm that uses an informed within-neighbourhood proposal in the second stage.…”
Section: Notation, Basic Formulas and a Brief Review of Search Methods
confidence: 99%
“…The tuning parameter ζ ∈ (ϵ, 1 − ϵ) denotes the non-informative jumping probability. Two different methods for adapting ζ are provided in [32]. The function g is a balancing function, satisfying g(x) = x g(1/x).…”
Section: The PARNI Proposal
confidence: 99%
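The balancing identity g(x) = x g(1/x) quoted above can be checked numerically. A minimal sketch, assuming two functions commonly used as balancing functions in the locally informed MCMC literature (the square-root and Barker functions; the paper may use others):

```python
import math

def g_sqrt(x):
    # g(x) = sqrt(x): then x * g(1/x) = x / sqrt(x) = sqrt(x)
    return math.sqrt(x)

def g_barker(x):
    # g(x) = x / (1 + x): then x * g(1/x) = x * (1/x) / (1 + 1/x) = x / (1 + x)
    return x / (1.0 + x)

# Verify the balancing identity g(x) = x * g(1/x) at a few points
for g in (g_sqrt, g_barker):
    for x in (0.25, 1.0, 4.0):
        assert abs(g(x) - x * g(1.0 / x)) < 1e-12
```

Both functions satisfy the identity exactly, which is what makes the resulting informed proposal weights reversible up to the usual Metropolis–Hastings correction.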