2019
DOI: 10.1109/tsp.2019.2894825
Split-and-Augmented Gibbs Sampler—Application to Large-Scale Inference Problems

Abstract: This paper derives two new optimization-driven Monte Carlo algorithms inspired by variable splitting and data augmentation. In particular, the formulation of one of the proposed approaches is closely related to the main steps of the alternating direction method of multipliers (ADMM). The proposed framework makes it possible to derive faster and more efficient sampling schemes than current state-of-the-art methods, and it can embed the latter. By efficiently sampling the parameter to infer as well as the hyperparameters of the…
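To make the splitting idea in the abstract concrete, here is a minimal worked correspondence between an augmented target density and the ADMM augmented Lagrangian. The notation (f, g, x, z, u, ρ) is assumed for illustration and is not taken verbatim from the paper:

```latex
% Target \pi(x) \propto \exp(-f(x) - g(x)), split via an auxiliary variable z:
\pi_\rho(x, z) \;\propto\; \exp\!\Big( -f(x) - g(z) - \tfrac{1}{2\rho^2}\,\|x - z\|_2^2 \Big),
% compared with the scaled-form ADMM augmented Lagrangian for
% \min_{x,z} f(x) + g(z) subject to x = z:
L_\rho(x, z, u) \;=\; f(x) + g(z) + \tfrac{\rho}{2}\,\|x - z + u\|_2^2 - \tfrac{\rho}{2}\,\|u\|_2^2 .
```

Alternately sampling x and z under πρ then mirrors the alternating x- and z-minimizations of ADMM, which appears to be the connection the abstract alludes to.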

Cited by 38 publications (68 citation statements: 1 supporting, 67 mentioning, 0 contrasting).
References 49 publications (96 reference statements).
“…Thereby, the marginal distribution of x under πρ stands for an approximation of π. The corresponding approximation error, controlled by ρ, can be made arbitrarily small as depicted by Theorem 1 proven in [20].…”
Section: Bayesian Hierarchical Model (mentioning, confidence: 99%)
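A hedged sketch of why the x-marginal approximates π, using the same illustrative notation as above (the precise statement and assumptions are those of Theorem 1 in [20]):

```latex
\int \pi_\rho(x, z)\, \mathrm{d}z
  \;\propto\; e^{-f(x)} \int e^{-g(z)}\,
     \exp\!\Big(-\tfrac{1}{2\rho^2}\,\|x - z\|_2^2\Big)\, \mathrm{d}z
  \;\xrightarrow[\;\rho \to 0\;]{}\; e^{-f(x) - g(x)} \;\propto\; \pi(x),
% since the Gaussian coupling kernel concentrates to a Dirac mass at z = x.
```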
“…Strong similarities with distributed optimization algorithms, e.g. the ADMM, are discussed in [20]. The Gibbs sampler derived to sample from the split distribution πρ is depicted in algo.…”
Section: Gibbs Sampler (mentioning, confidence: 99%)
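A minimal runnable sketch of such a two-block Gibbs sampler, for a toy model in which both f and g are Gaussian so that the two conditionals are available in closed form. The model, dimensions, and variable names are illustrative assumptions, not the experimental setup of [20]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy split model: f(x) = ||y - A x||^2 / (2 sigma^2), g(z) = ||z||^2 / (2 tau^2),
# coupled through the splitting kernel ||x - z||^2 / (2 rho^2).
d, n = 20, 50
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)
sigma, tau, rho = 1.0, 1.0, 0.1

def sample_gaussian(Q, b):
    """Draw from N(Q^{-1} b, Q^{-1}) via a Cholesky factor of the precision Q."""
    L = np.linalg.cholesky(Q)
    mu = np.linalg.solve(Q, b)
    return mu + np.linalg.solve(L.T, rng.standard_normal(Q.shape[0]))

Qx = A.T @ A / sigma**2 + np.eye(d) / rho**2   # precision of x | z (constant here)
x = np.zeros(d)
z = np.zeros(d)
for it in range(1000):
    # Step 1: x | z is Gaussian with precision Qx and potential A^T y / sigma^2 + z / rho^2.
    x = sample_gaussian(Qx, A.T @ y / sigma**2 + z / rho**2)
    # Step 2: z | x is Gaussian with isotropic (hence diagonal) covariance,
    # which is what makes the auxiliary step cheap.
    var_z = 1.0 / (1.0 / tau**2 + 1.0 / rho**2)
    z = var_z * x / rho**2 + np.sqrt(var_z) * rng.standard_normal(d)
```

Note how the auxiliary step draws z from a Gaussian with diagonal covariance; this is the cheap "divide-and-conquer" step highlighted in the citation statements below.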
“…As stated in [17], SPA can be viewed as a “divide-and-conquer” approach where the initial sampling difficulty is divided in different easier sampling steps. Thus, the conditional distributions associated to β, u₁ and u₂ are Gaussian with diagonal covariance matrices for the last two distributions.…”
Section: SPA Algorithm (mentioning, confidence: 99%)
“…These sampling methods are special instances of Metropolis-Hastings algorithms based on the Langevin diffusion process which were improved in [16]. More recently, the connection between simulation-based algorithms and optimization has been strengthened by the so-called split-and-augmented Gibbs sampler (SPA) [17]. This algorithm stands for a general tool to conduct Bayesian inference that uses a “divide-and-conquer” strategy.…”
(mentioning, confidence: 99%)