2019
DOI: 10.48550/arxiv.1903.05779
Preprint
Functional Variational Bayesian Neural Networks

Abstract: Variational Bayesian neural networks (BNNs) perform variational inference over weights, but it is difficult to specify meaningful priors and approximate posteriors in a high-dimensional weight space. We introduce functional variational Bayesian neural networks (fBNNs), which maximize an Evidence Lower BOund (ELBO) defined directly on stochastic processes, i.e. distributions over functions. We prove that the KL divergence between stochastic processes equals the supremum of marginal KL divergences over all finit…
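A minimal numerical sketch of the functional-ELBO idea from the abstract: an ELBO on function values at a finite "measurement set" of inputs, with the KL term computed between the marginals of a variational distribution and a GP prior at those points. Everything here — the toy data, the stand-in Gaussian variational marginals, the RBF prior, and the measurement-set construction — is an illustrative assumption, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy 1-D regression data (assumption: sine curve with observation noise 0.1)
X = np.linspace(-3, 3, 20)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)

def rbf(A, B, ls=1.0):
    """RBF kernel matrix between row-vector inputs A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def gauss_kl(mu_q, Sq, mu_p, Sp):
    """Closed-form KL( N(mu_q, Sq) || N(mu_p, Sp) ) between multivariate Gaussians."""
    k = len(mu_q)
    Sp_inv = np.linalg.inv(Sp)
    diff = mu_p - mu_q
    _, logdet_p = np.linalg.slogdet(Sp)
    _, logdet_q = np.linalg.slogdet(Sq)
    return 0.5 * (np.trace(Sp_inv @ Sq) + diff @ Sp_inv @ diff
                  - k + logdet_p - logdet_q)

# measurement set: training inputs plus a few random points
M = np.vstack([X, rng.uniform(-3, 3, (5, 1))])

# stand-in Gaussian variational marginals q(f(M)) (purely illustrative)
mu_q = np.sin(M[:, 0])
Sq = 0.05 * np.eye(len(M))

# GP prior marginals p(f(M)) with a small jitter for numerical stability
mu_p = np.zeros(len(M))
Sp = rbf(M, M) + 1e-4 * np.eye(len(M))

# expected Gaussian log-likelihood under q at the training inputs (sigma = 0.1)
n_train, sigma2 = len(X), 0.1 ** 2
ell = (-0.5 * np.sum((y - mu_q[:n_train]) ** 2 + np.diag(Sq)[:n_train]) / sigma2
       - 0.5 * n_train * np.log(2 * np.pi * sigma2))

# functional ELBO estimate at this measurement set
felbo = ell - gauss_kl(mu_q, Sq, mu_p, Sp)
print(f"functional ELBO estimate: {felbo:.2f}")
```

In the actual method the variational distribution is induced by a BNN (so its marginals are not Gaussian and the KL is estimated rather than computed in closed form); the sketch only shows the shape of the objective: expected log-likelihood minus a marginal KL at a finite set of points.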

Cited by 36 publications (39 citation statements)
References 11 publications
“…However, most modern applications of BNNs still relied on simple Gaussian priors. Although a few different priors have been proposed for BNNs, these were mostly designed for specific tasks (Atanov et al, 2018;Ghosh & Doshi-Velez, 2017;Overweg et al, 2019;Nalisnick, 2018;Cui et al, 2020;Hafner et al, 2020) or relied heavily on non-standard inference methods (Sun et al, 2019;Ma et al, 2019;Karaletsos & Bui, 2020;Pearce et al, 2020). Moreover, while many interesting distributions have been proposed as variational posteriors for BNNs (Louizos & Welling, 2017;Swiatkowski et al, 2020;Dusenberry et al, 2020;Aitchison et al, 2020), these approaches have still used Gaussian priors.…”
Section: Related Work
confidence: 99%
“…Bayesian methods make assumptions about the prior and the loss function. However, these assumptions are not always fulfilled (Blundell et al, 2015;Iwata & Ghahramani, 2017;Kuleshov et al, 2018;Lakshminarayanan et al, 2016;Sun et al, 2019). Multiple observation noise models and normalizing flows (Durkan et al, 2019;Gopal & Key, 2021) can generalize to any noise distribution, but choosing an appropriate likelihood function is left to human expertise.…”
Section: Related Work
confidence: 99%
“…We intend to use a data-driven regression method to determine this probability distribution and verify our presumption. To this end, we resort to B-DNNs [10], [11], [12], [13], which apply Bayesian statistics to the inference of DNN weights. By further merging the B-DNN with a Gaussian process [20], [21], the price-change distribution is simplified to a Gaussian distribution.…”
Section: Effect of the Principal Investor's Decision
confidence: 99%
“…There are two fine-tuned hyperparameters, σ and λ, for optimizing our model's performance. Since σ has a significant influence on the posterior predictive variance, as shown in Equation (12), the optimal value of σ can be found using the "68-95-99.7 rule". We carry out Bayesian inference and regression analysis with two sets of parameters (σ = 2.5, λ = 0.7, and σ = 1, λ = 0.1) and exhibit the results in Figure 2 and Figure 3 in turn.…”
Section: Effect of the Principal Investor's Decision
confidence: 99%
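The "68-95-99.7 rule" invoked in the statement above says a Gaussian places about 68%, 95%, and 99.7% of its mass within one, two, and three standard deviations of its mean. A hedged sketch of how such a rule could be used to sanity-check a choice of σ — the residuals here are synthetic and the variable names are hypothetical, not drawn from the cited paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical residuals between observations and posterior predictive means;
# if sigma is well chosen, coverage at k*sigma should match the 68-95-99.7 rule
residuals = rng.normal(0.0, 2.5, size=10_000)

def coverage(res, sigma, k):
    """Fraction of residuals falling within k standard deviations of zero."""
    return np.mean(np.abs(res) <= k * sigma)

sigma = 2.5  # candidate value being checked
for k, target in [(1, 0.68), (2, 0.95), (3, 0.997)]:
    print(f"{k}*sigma coverage: empirical {coverage(residuals, sigma, k):.3f}, "
          f"target {target}")
```

If the empirical coverages deviate systematically from the targets, the candidate σ under- or over-states the predictive spread and should be adjusted.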