2019
DOI: 10.1080/10618600.2019.1637747

Bayesian Deep Net GLM and GLMM

Abstract: Deep feedforward neural networks (DFNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models incorporating basis functions formed by a DFNN. The consideration of neural networks with random effects is not widely used in the literature, perhaps because of the computational challenges of incorporating subject specific parameters into already complex models. Efficient computational methods for high-dimensional Bayesian inference…

Cited by 54 publications (50 citation statements)
References 41 publications
“…Deep models have also been proposed for glms, which is a flexible technique for modeling data distributed according to exponential family distributions, and can model a large class of response variables, including real-valued, categorical or counts. Tran et al (2018) develop flexible versions for glms using the output of a feedforward neural network. With responses y and predictors X, a conventional glm models E(y | X) = g(Xβ), where β are the regression coefficients and g(·) is the link function.…”
Section: Horseshoe Shrinkage in Deep Models
confidence: 99%
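The conventional GLM described in the statement above can be sketched in a few lines. This is an illustrative example only (a logistic mean function on simulated data), not code from the cited paper; here `g` maps the linear predictor to the mean scale, as in the quoted formula E(y | X) = g(Xβ).

```python
import numpy as np

# Illustrative GLM sketch (not the paper's code): the conditional mean
# is a transformed linear function of the predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))      # predictors
beta = np.array([0.5, -1.0, 2.0])  # regression coefficients

def g(eta):
    """Logistic mean function: maps the linear predictor into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-eta))

mean = g(X @ beta)                 # E(y | X), linear in X up to the link
```

Note that the linear predictor `X @ beta` is the only place the covariates enter, which is exactly the linearity restriction the next statement discusses.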
“…Thus, the conditional mean of the responses is a linear function of X, transformed through the link function g. This linearity assumption is often restrictive and a natural way to introduce nonlinearity is by replacing X with the output of a multi-layer feedforward neural network that has X as input and consequently, whose output is a nonlinear function of X. Tran et al (2018) term this model DeepGLM. Similar to deep neural networks, a variational approximation to the log likelihood is used for training the model and global-local priors are used for inducing sparsity on β.…”
Section: Horseshoe Shrinkage in Deep Models
confidence: 99%
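The DeepGLM construction described above — replacing the raw covariates X with the output of a feedforward network before the GLM layer — can be sketched as follows. The architecture, weights, and names here are illustrative assumptions, not the authors' implementation (which trains via variational approximation with global-local shrinkage priors on β).

```python
import numpy as np

# Hedged DeepGLM-style sketch: the GLM coefficients act on nonlinear
# basis functions of X produced by a small feedforward network.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))

# One hidden layer with random (untrained) weights, for illustration only.
W1 = rng.normal(size=(3, 8))
b1 = rng.normal(size=8)
hidden = np.tanh(X @ W1 + b1)        # nonlinear basis functions of X

beta = rng.normal(size=8)            # GLM coefficients on the basis
eta = hidden @ beta                  # linear predictor in the new basis

def g(eta):
    """Logistic mean function, as in the conventional GLM."""
    return 1.0 / (1.0 + np.exp(-eta))

mean = g(eta)                        # E(y | X) is now a nonlinear function of X
```

The design point is that only the inputs to the final linear layer change: the likelihood, link, and interpretation of β as coefficients on (learned) basis functions carry over from the standard GLM.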
“…Chatzis, 2015; Chien and Ku, 2016; Gan et al, 2016; McDermott and Wikle, 2017a) but are quite sensitive to particular data sets and are typically computationally prohibitive. More recently, approximate Bayesian methods such as variational Bayes (Tran et al, 2018) and scalable Bayesian methods (Snoek et al, 2015) have been used successfully in deep models. In the context of DN-DSTMs this is still an active area of research.…”
Section: Combining the DH-DSTM and DN-DSTM Framework
confidence: 99%
“…The similarities and differences discussed above have helped establish a new branch in Statistical Science that looks at combining formal statistical models with the flexibility of deep ML models with a view to exploiting the strengths of both approaches in a single framework for better prediction and forecasting. For example, Nguyen et al (2019) use a type of RNN known as the long short-term memory (LSTM) within a classic stochastic volatility model in order to cater for long-range (temporal) dependence, while Tran et al (2019) used a deep structure to model nonstationarity in spatial models. The resulting model is interpretable, but the framework is firmly seated within the standard geostatistical setting where time, if considered, would be treated as an extra dimension; such models tend to be ill-suited for forecasting purposes.…”
Section: Introduction
confidence: 99%