2018
DOI: 10.1214/18-ejs1481

Geometric ergodicity of Pólya-Gamma Gibbs sampler for Bayesian logistic regression with a flat prior

Abstract: Logistic regression is the most popular model for analyzing binary data. In the absence of prior information, an improper flat prior is often used for the regression coefficients in Bayesian logistic regression models. The resulting intractable posterior density can be explored by running Polson et al.'s (2013) data augmentation (DA) algorithm. In this paper, we establish that the Markov chain underlying Polson et al.'s (2013) DA algorithm is geometrically ergodic. Proving this theoretical resul…
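For context, the Pólya-Gamma DA algorithm the abstract refers to alternates two conditional draws: latent variables ω_i | β ~ PG(1, x_i'β), then β | ω ~ N((X'ΩX)⁻¹X'κ, (X'ΩX)⁻¹) with κ_i = y_i − 1/2 (here under the flat prior, so no prior precision term). The following is a minimal sketch, not the authors' implementation: in particular, the PG(1, z) draw below uses a truncated version of the infinite-sum-of-gammas representation rather than the exact sampler of Polson et al. (2013), and all function names are illustrative.

```python
import numpy as np

def sample_pg_approx(b, z, rng, K=200):
    """Approximate PG(b, z) draws via the truncated sum-of-gammas representation:
    PG(b, z) = (1 / (2*pi^2)) * sum_{k>=1} Gamma(b, 1) / ((k - 1/2)^2 + z^2 / (4*pi^2)).
    z is a 1-D array; returns one draw per entry of z."""
    k = np.arange(1, K + 1)
    denom = (k - 0.5) ** 2 + (z[:, None] ** 2) / (4 * np.pi ** 2)
    g = rng.gamma(shape=b, scale=1.0, size=(z.shape[0], K))
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def pg_gibbs(X, y, n_iter=500, rng=None):
    """Pólya-Gamma Gibbs sampler for Bayesian logistic regression, flat prior on beta."""
    rng = np.random.default_rng(0) if rng is None else rng
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Step 1: draw latent PG variables given current beta.
        omega = sample_pg_approx(1.0, X @ beta, rng)
        # Step 2: draw beta | omega ~ N((X' Omega X)^{-1} X' kappa, (X' Omega X)^{-1}).
        prec = X.T @ (omega[:, None] * X)
        cov = np.linalg.inv(prec)
        beta = rng.multivariate_normal(cov @ (X.T @ kappa), cov)
        draws[t] = beta
    return draws
```

With the flat prior, posterior propriety (and hence the geometric ergodicity studied in the paper) requires conditions on the design matrix and the data; the sketch above assumes a full-rank X for which the posterior is proper.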

Cited by 21 publications (15 citation statements) | References 19 publications
“…Availability of a Markov chain CLT has been demonstrated for myriad MCMC algorithms for common statistical models. Here we provide an incomplete list: linear models (Hobert 2012, 2015), generalized linear models including the probit model (Roy and Hobert 2007, Chakraborty and Khare 2017), the popular logistic model (Choi and Hobert 2013, Wang and Roy 2018c) and the robit model (Roy 2012), generalized linear mixed models including the probit mixed model (Wang and Roy 2018b) and the logistic mixed model (Wang and Roy 2018a), quantile regression models (Khare and Hobert 2012), multivariate regression models (Roy and Hobert 2010, Hobert et al. 2018), and penalized regression and variable selection models (Khare and Hobert 2013, Roy and Chakraborty 2017, Vats 2017). So far we have described the honest MCMC in the context of estimating means of univariate functions.…”
Section: Honest MCMC
Confidence: 99%
“…In particular, Choi and Hobert [11] show that the parent PG DA chain is uniformly ergodic (using the marginal β(t) chain) if proper priors are used. Following this work, Wang and Roy [56] show the geometric ergodicity of the PG DA chain with flat priors. The uniform ergodicity proof for the parent DA chain in Choi and Hobert [11] for the binary and proper prior setting is based on a minorization argument on the marginal {β(t)} chain; however, this proof strategy fails for ADDA because the {β(t)} process for Φ_ADPG is not Markov.…”
Section: ADDA for Bayesian Logistic Regression
Confidence: 80%
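For readers unfamiliar with the term, the "minorization argument" invoked in this excerpt is the standard route to uniform ergodicity in Markov chain theory (this is textbook material, not a reconstruction of the cited papers' specific proofs): a global minorization condition on the transition kernel P of the form

```latex
% Global minorization: there exist \varepsilon > 0 and a probability
% measure \nu such that
\[
  P(x, A) \;\ge\; \varepsilon\, \nu(A)
  \qquad \text{for all states } x \text{ and all measurable sets } A,
\]
% which implies uniform ergodicity of the chain with invariant
% distribution \pi, at a geometric rate independent of the start point:
\[
  \bigl\| P^{n}(x, \cdot) - \pi \bigr\|_{\mathrm{TV}}
  \;\le\; (1 - \varepsilon)^{n}.
\]
```

The obstacle the excerpt describes is that this argument is applied to the marginal {β(t)} chain, which is Markov for the parent DA algorithm but not for the asynchronous ADDA variant.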
“…The marginal Markov chain {β (t) } of the β draws collected in step PG.2 has the posterior distribution of β in (1) as its invariant distribution [11,56,55].…”
Section: ADDA for Bayesian Logistic Regression
Confidence: 99%
“…Previous work on the theoretical properties of PG Gibbs samplers includes Choi and Hobert (2013), Choi and Román (2017), and Wang and Roy (2018b). Choi and Hobert (2013) analyzed the PG Gibbs sampler under the same normal prior on β and showed that it is uniformly ergodic.…”
Section: Introduction
Confidence: 99%