2013
DOI: 10.1080/01621459.2013.829001
Bayesian Inference for Logistic Models Using Pólya–Gamma Latent Variables

Abstract: We propose a new data-augmentation strategy for fully Bayesian inference in models with binomial likelihoods. The approach appeals to a new class of Pólya-Gamma distributions, which are constructed in detail. A variety of examples are presented to show the versatility of the method, including logistic regression, negative binomial regression, nonlinear mixed-effects models, and spatial models for count data. In each case, our data-augmentation strategy leads to simple, effective methods for posterior inference…
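For context, the integral identity at the heart of the paper (its Theorem 1) is the following; symbols match the paper's notation:

```latex
\frac{(e^{\psi})^{a}}{(1+e^{\psi})^{b}}
  = 2^{-b}\, e^{\kappa \psi} \int_{0}^{\infty} e^{-\omega \psi^{2}/2}\, p(\omega)\, d\omega,
\qquad \kappa = a - \tfrac{b}{2},
```

where p(ω) is the density of ω ~ PG(b, 0). Conditioning on the latent ω turns the binomial likelihood in ψ into a Gaussian kernel, which is what makes the Gibbs updates conditionally conjugate under Gaussian priors.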

Cited by 771 publications (1,112 citation statements)
References 29 publications
“…With this simple data-augmentation trick from Polson et al. (2013), the above model can be fit by Gibbs sampling, with the following conditionals. We let − denote all other variables being conditioned upon, and κ^(s) = y^(s) − m^(s)/2.…”
Section: Further Details of Bayesian Methods
confidence: 99%
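The conditionals the excerpt above refers to can be sketched for plain Bayesian logistic regression. This is a minimal illustration, not the authors' implementation: the PG(1, ψ) draw uses a rough truncation of the infinite-sum representation from the paper (in practice a dedicated sampler such as the authors' BayesLogit package would be used), and the N(0, τ²I) prior and its variance `tau2` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_pg(b, c, K=200):
    """Approximate PG(b, c) draws via a truncated infinite-sum representation:
    omega = (1 / 2*pi^2) * sum_k g_k / ((k - 1/2)^2 + c^2 / (4*pi^2)), g_k ~ Gamma(b, 1)."""
    k = np.arange(1, K + 1)
    denom = (k - 0.5) ** 2 + (c[:, None] ** 2) / (4 * np.pi ** 2)
    g = rng.gamma(b, 1.0, size=(len(c), K))
    return (g / denom).sum(axis=1) / (2 * np.pi ** 2)

def gibbs_logistic(X, y, n_iter=500, tau2=100.0):
    """Gibbs sampler for logistic regression with PG augmentation.

    Conditionals (Polson et al., 2013), with kappa_i = y_i - 1/2:
      omega_i | beta ~ PG(1, x_i' beta)
      beta | omega   ~ N(m, V),  V = (X' Omega X + I/tau2)^{-1},  m = V X' kappa
    """
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        omega = sample_pg(1.0, X @ beta)          # latent PG variables
        V = np.linalg.inv(X.T @ (omega[:, None] * X) + np.eye(p) / tau2)
        m = V @ (X.T @ kappa)
        beta = rng.multivariate_normal(m, V)      # conjugate Gaussian update
        draws[t] = beta
    return draws
```

Note that the β-update never evaluates the logistic likelihood directly; once ω is drawn, the step is an ordinary Bayesian linear-model update, which is exactly the simplification the excerpt describes.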
“…We use the gamma lasso because it also has a simple interpretation as a mixture of Laplace priors, making the connection with the graph fused lasso explicit, and because our experiments with other choices led to essentially equivalent performance. We represent the binomial likelihood using the Pólya-Gamma data augmentation scheme described in Polson et al. (2013). This yields conditionally conjugate MCMC updates for all model parameters.…”
Section: Details
confidence: 99%
“…The logistic counterpart of the normal ogive model, known as the Rasch model, has recently been addressed by Polson et al. (2013), who propose a Gibbs sampling procedure for logistic models analogous to the data-augmentation procedure. The latent responses follow a Pólya-Gamma distribution, and if the prior distributions p(θ) and p(b) are normal, it follows that the posterior distributions of the person and item parameters can again be derived using normal linear model results.…”
Section: Data Augmentation
confidence: 99%
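To make the "normal linear model results" in the excerpt concrete, here is a sketch of the person-parameter conditional in the Rasch model, P(y_ij = 1) = e^{θ_i − b_j}/(1 + e^{θ_i − b_j}), under PG augmentation with ω_ij ~ PG(1, θ_i − b_j) and an assumed N(μ_θ, σ_θ²) prior on θ_i:

```latex
\theta_i \mid - \;\sim\; \mathcal{N}(m_i, v_i), \qquad
v_i = \Big(\textstyle\sum_j \omega_{ij} + \sigma_\theta^{-2}\Big)^{-1}, \qquad
m_i = v_i \Big(\textstyle\sum_j (\kappa_{ij} + \omega_{ij} b_j) + \mu_\theta/\sigma_\theta^{2}\Big),
```

with κ_ij = y_ij − 1/2. The item-parameter conditional for b_j has the same normal form by symmetry, so the whole sampler cycles through Gaussian and Pólya-Gamma draws.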
“…Separated clusters often result in distinct and more interpretable clusters, and the number of clusters is smaller. Previous work has argued that discriminative clustering is desirable in various tasks, e.g., subspace selection analysis [De la Torre and Kanade, 2006; Ye et al., 2008], computer vision [Joulin et al., 2010], unsupervised regression [Krause et al., 2010] and factor modeling [Henao et al., 2014].…”
Section: Introduction
confidence: 99%
“…As the number of clusters is often unknown and grows over time, modern Bayesian nonparametric (BNP) clustering methods have the advantage of automatically identifying a suitable number of clusters; the most popular models include Dirichlet process mixture models (DPM) [Ferguson, 1973; Antoniak, 1974; Sethuraman, 1994; Ghahramani, 2012]. While DPM and related models have been successful, they are strictly generative: the clustering outcomes favor similarity of data points within a cluster but do not explicitly model the discrimination among clusters.…”
Section: Introduction
confidence: 99%