2014
DOI: 10.1007/s11222-014-9480-2

Variational approximations in geoadditive latent Gaussian regression: mean and quantile regression


Cited by 4 publications (4 citation statements)
References 20 publications
“…We applied four different specification/estimation methods: FE refers to the fixed effects panel quantile regression (Koenker ) with optimal shrinkage estimated following Lamarche (); CRE is the correlated random effects estimator of Abrevaya and Dahl () amended following Bache et al () to allow for unbalanced panels; MCMC is a Bayesian formulation with country random effects with Dirichlet process priors, estimated by the modified Markov Chain Monte Carlo algorithm of Waldmann et al (). This version has been chosen because of improved convergence; VA is a computational simplification of the Bayesian formulation with more restrictive Gaussian priors on the random effects and estimated using the variational approximation method of Waldmann and Kneib (). This provides a more restrictive model, but considerably reduces the computational costs. …”
Section: Results (mentioning; confidence: 99%)
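
As rough orientation for the estimators named in the quoted passage, the sketch below illustrates in Python the kind of objective a fixed-effects panel quantile regression with shrinkage minimizes: a check-loss fit term plus an L1 penalty that pulls the unit-specific effects toward zero. The simulated data, variable names and generic optimizer are assumptions made for the illustration; this is not the cited authors' estimator, and practical implementations solve the problem as a linear program.

```python
# Minimal sketch (assumptions: simulated data, hypothetical variable names) of a
# penalized fixed-effects panel quantile regression objective: check-loss fit plus
# an L1 shrinkage penalty on the unit effects. Real implementations use linear
# programming; a generic optimizer is used here only to keep the example short.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_units, n_periods, tau, lam = 20, 10, 0.5, 1.0

unit = np.repeat(np.arange(n_units), n_periods)      # panel index i for each row
x = rng.normal(size=n_units * n_periods)             # single regressor
alpha_true = rng.normal(scale=0.5, size=n_units)     # true unit-specific effects
y = 1.5 * x + alpha_true[unit] + rng.standard_t(df=4, size=x.size)

def check_loss(u, tau):
    """Koenker-Bassett check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def objective(theta):
    beta, alpha = theta[0], theta[1:]                # slope and unit effects
    resid = y - beta * x - alpha[unit]
    return check_loss(resid, tau).sum() + lam * np.abs(alpha).sum()

theta0 = np.zeros(1 + n_units)
fit = minimize(objective, theta0, method="Powell")   # derivative-free; fine for a toy problem
print("estimated slope at tau = 0.5:", round(float(fit.x[0]), 3))
```

The tuning constant lam controls how strongly the unit effects are shrunk; selecting it well is the "optimal shrinkage" step referred to in the quotation.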
“…Although by far the most popular, the Koenker () estimator is the most demanding computationally in that the search for optimal amount of shrinkage carries considerable computational costs. Bayesian estimation via Markov Chain Monte Carlo (MCMC) simulation is always computationally demanding, but workable approximations, such as variational inference (Waldmann and Kneib ), or (integrated nested) Laplace approximation (Yue and Rue ) are available to reduce the computational costs. Finally, the CRE approach is computationally the most appealing, although it requires practical examination and justification of the identification assumptions and hence may not always be applicable to the specific estimation problem.…”
Section: Methods (mentioning; confidence: 99%)
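
For context on why variational inference and Laplace approximations are the cheaper alternatives mentioned above: instead of simulating from the posterior, variational inference maximizes a lower bound on the log marginal likelihood over a tractable family of approximating densities q, so posterior inference becomes an optimization problem. In standard notation (not taken from the cited papers),

\[
\log p(y) \;\ge\; \mathbb{E}_{q(\theta)}\bigl[\log p(y,\theta)\bigr] - \mathbb{E}_{q(\theta)}\bigl[\log q(\theta)\bigr] \;=\; \operatorname{ELBO}(q),
\]

and the chosen approximation is the member of the family that makes this bound tightest, equivalently the q with the smallest Kullback-Leibler divergence to the exact posterior.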
“…Under the topic of other inferential approaches, I would particularly be interested in hearing Simon's opinion on the ability of variational approximations for estimating complex generalized additive models. Waldmann and Kneib (2015) have utilized these for inference in Gaussian mean regression and quantile regression (again based on the working likelihood of the asymmetric Laplace distribution) where similar schemes as with Gibbs sampling in Markov chain Monte Carlo simulations can be derived. On the other hand, they also found that uncertainty quantification tends to be complicated using simple forms of variational approximations.…”
Mentioning (confidence: 99%)
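
The "working likelihood of the asymmetric Laplace distribution" referred to in this statement is the standard device for casting quantile regression in likelihood form. Writing \eta for the regression predictor, \sigma > 0 for a scale parameter and \tau \in (0,1) for the quantile of interest, the density is (standard form, not quoted from the paper under discussion)

\[
p(y \mid \eta, \sigma, \tau) \;=\; \frac{\tau(1-\tau)}{\sigma}\,
\exp\!\left\{-\rho_\tau\!\left(\frac{y-\eta}{\sigma}\right)\right\},
\qquad
\rho_\tau(u) \;=\; u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr).
\]

Maximizing this in \eta is equivalent to minimizing the check loss \rho_\tau, which is why Gibbs or variational schemes built on this likelihood target the conditional \tau-quantile even when the data are not actually asymmetric-Laplace distributed.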