2016
DOI: 10.1609/aaai.v30i1.10296

Assumed Density Filtering Methods for Learning Bayesian Neural Networks

Abstract: Buoyed by the success of deep multilayer neural networks, there is renewed interest in scalable learning of Bayesian neural networks. Here, we study algorithms that utilize recent advances in Bayesian inference to efficiently learn distributions over network weights. In particular, we focus on recently proposed assumed density filtering based methods for learning Bayesian neural networks -- Expectation and Probabilistic backpropagation. Apart from scaling to large datasets, these techniques seamlessly deal with…
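Both methods named in the abstract build on the same assumed density filtering (ADF) idea: maintain a factorized Gaussian over the weights, absorb one likelihood factor at a time, and project the result back onto a Gaussian by moment matching. The sketch below illustrates that moment-matching update for a single Gaussian weight in a linear unit, where the marginal likelihood Z is available in closed form. It is a toy illustration of the update rule only, not the paper's full Probabilistic backpropagation, which additionally propagates means and variances through the network's nonlinearities; all names here are illustrative.

```python
import numpy as np

# Toy ADF sketch: one Gaussian weight w ~ N(mu, v) in a linear unit,
# likelihood y = w*x + eps with eps ~ N(0, noise_var). Each observation
# is absorbed once, then the posterior is projected back onto a Gaussian
# via the standard moment-matching updates:
#   mu' = mu + v * dlogZ/dmu
#   v'  = v  - v^2 * ((dlogZ/dmu)^2 - 2 * dlogZ/dv)

def adf_update(mu, v, x, y, noise_var=0.1):
    denom = noise_var + v * x**2                  # predictive variance of y
    dlogZ_dmu = x * (y - mu * x) / denom
    dlogZ_dv = 0.5 * (x**2 * (y - mu * x)**2 / denom**2 - x**2 / denom)
    mu_new = mu + v * dlogZ_dmu
    v_new = v - v**2 * (dlogZ_dmu**2 - 2.0 * dlogZ_dv)
    return mu_new, v_new

rng = np.random.default_rng(0)
w_true, mu, v = 1.5, 0.0, 1.0                     # true weight; prior N(0, 1)
for _ in range(200):                              # stream data through the filter
    x = rng.normal()
    y = w_true * x + rng.normal(scale=0.1 ** 0.5)
    mu, v = adf_update(mu, v, x, y)
print(f"posterior mean {mu:.3f}, variance {v:.2e}")
```

Because everything in this toy is Gaussian, the moment-matched update is exact; in a real network the same update is applied after approximating Z by propagating means and variances layer by layer.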

Cited by 18 publications (8 citation statements)
References 13 publications
“…Variational inference for BNNs is well-represented in the literature [9,10,12,19,27,28] with extensions to classification tasks [29] and non-normal prior or variational posterior distributions [7,12,30]. Throughout the experiments of Section 5 we validate our pruning approach for two of the most common variational inference algorithms: Bayes-by-backprop [12] and variance backpropagation [10].…”
Section: Probabilistic Inference
confidence: 84%
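For contrast with the ADF-style updates of the paper above, the Bayes-by-backprop algorithm cited in this excerpt fits a factorized Gaussian posterior by stochastic gradient descent on the negative ELBO via the reparameterization trick. A minimal sketch, assuming a PyTorch setup, a standard-normal prior, and a bias-free layer; class and variable names are illustrative:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    """Linear layer with factorized Gaussian posterior q(w) = N(mu, sigma^2)."""
    def __init__(self, n_in, n_out):
        super().__init__()
        self.mu = nn.Parameter(0.1 * torch.randn(n_out, n_in))
        self.rho = nn.Parameter(torch.full((n_out, n_in), -3.0))  # sigma = softplus(rho)

    def forward(self, x):
        sigma = F.softplus(self.rho)
        w = self.mu + sigma * torch.randn_like(sigma)  # reparameterization trick
        # closed-form KL(q(w) || N(0, 1)), summed over all weights
        self.kl = (0.5 * (sigma**2 + self.mu**2) - torch.log(sigma) - 0.5).sum()
        return F.linear(x, w)

# One training step: negative log-likelihood plus KL scaled by dataset size N
layer = BayesLinear(4, 1)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
x, y = torch.randn(32, 4), torch.randn(32, 1)
loss = F.mse_loss(layer(x), y) + layer.kl / 1000.0   # assumes N = 1000 data points
loss.backward()
opt.step()
```

Unlike ADF's single pass over the data, this objective is optimized over many epochs with minibatches, with the KL term typically rescaled by the dataset size as above.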
“…Using a binary scheme as a conditioning method assumes that there is a fixed amount of distribution to the model. In an uncontrolled environment, this is a potentially false assumption as unknown variables could introduce noise to the signal that would make it difficult to distinguish the ground truth close to the bin edges [37,38]. This is potentially further exacerbated when processing thermal video from cameras with a relative internal thermograph.…”
Section: Methods
confidence: 99%
“…Additionally, methods for propagating aleatoric uncertainty from the input to the output of the neural network can be classified into two main groups: layer-wise and entire-network uncertainty propagation (Abdelaziz et al., 2015). Though layer-wise uncertainty propagation methods (Ghosh et al., 2016; Hernández-Lobato and Adams, 2015; Gast and Roth, 2018; Wang et al., 2016; Astudillo and Neto, 2011) can offer the distributions of hidden layers, they often require modification to the original network during the training or inference phases. Moreover, Abdelaziz et al. (2015) and Chua et al. (2018) demonstrate that entire-network uncertainty propagation through particle-based propagation methods such as the Unscented Transform (Julier and Uhlmann, 1997) can be competitive in terms of accuracy and computation.…”
Section: Modeling Uncertainty in Deep Neural Networks and Uncertainty-...
confidence: 99%
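As a concrete instance of the entire-network propagation this excerpt contrasts with layer-wise methods, the unscented transform pushes 2n+1 deterministically chosen sigma points through the unmodified network and re-estimates the output mean and covariance from them. A sketch under the assumption that f is the network's ordinary deterministic forward pass; the function name and scaling defaults are illustrative:

```python
import numpy as np

def unscented_propagate(f, mu, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate an input distribution N(mu, cov) through a nonlinear f."""
    n = mu.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * cov)         # scaled matrix square root
    pts = np.vstack([mu, mu + L.T, mu - L.T])       # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    wc = wm.copy()                                  # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    ys = np.array([f(p) for p in pts])              # whole-network forward passes
    mean = wm @ ys
    diff = ys - mean
    return mean, (wc[:, None] * diff).T @ diff      # output mean and covariance

# Hypothetical usage with some trained model's forward function:
# y_mean, y_cov = unscented_propagate(net_forward, x_mean, x_cov)
```

Nothing inside f is modified, which matches the excerpt's point that such particle-based schemes avoid changing the network during training or inference.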
“…Most existing works applying deep neural networks for autonomous navigation account for epistemic uncertainty only, for instance, by using autoencoders (Richter and Roy, 2017), dropout and bootstrap (Kahn et al., 2017; Georgakis et al., 2022; Lütjens et al., 2019), 2D spatial dropout (Amini et al., 2017), or evidential fusion (Liu et al., 2021). One of the exceptions is Loquercio et al. (2020), which accounts for both: aleatoric uncertainty in the image data using Assumed Density Filtering (Ghosh et al., 2016) and epistemic uncertainty using MC dropout. Chua et al. (2018) propose to use particle propagation to estimate the aleatoric uncertainty and Deep Ensembles to derive the epistemic uncertainty.…”
Section: Modeling Uncertainty in Deep Neural Networks and Uncertainty-...
confidence: 99%
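The MC-dropout estimate of epistemic uncertainty mentioned here is straightforward to reproduce: keep dropout stochastic at prediction time and treat repeated forward passes as approximate posterior samples. A minimal sketch, assuming a PyTorch model that contains dropout layers; the function name and sample count are illustrative:

```python
import torch

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    """Predictive mean and variance from stochastic forward passes."""
    model.train()   # keeps dropout masks sampled; caveat: also affects batch-norm
    preds = torch.stack([model(x) for _ in range(n_samples)])
    model.eval()
    return preds.mean(dim=0), preds.var(dim=0)  # variance ~ epistemic uncertainty
```

The spread across samples captures epistemic uncertainty only; pairing it with an aleatoric estimate propagated from the input, as the excerpt attributes to Loquercio et al. (2020), covers both kinds.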