2020
DOI: 10.3390/s20216011
Measuring the Uncertainty of Predictions in Deep Neural Networks with Variational Inference

Abstract: We present a novel approach for training deep neural networks in a Bayesian way. Compared to other Bayesian deep learning formulations, our approach allows for quantifying the uncertainty in model parameters while adding only very few additional parameters to be optimized. The proposed approach uses variational inference to approximate the intractable a posteriori distribution on the basis of a normal prior. By representing the a posteriori uncertainty of the network parameters per network layer and depending on t…
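The abstract describes a variational formulation in which the posterior uncertainty of the weights is captured by only a few extra parameters per layer. A minimal, hypothetical sketch of this idea is shown below, assuming a Gaussian posterior N(mu, exp(log_var)·I) with a single shared log-variance per layer and the standard reparameterization trick; the `VariationalLinear` class and its parameterization are illustrative only, not the authors' exact formulation (which, per the citing papers, uses separate uncertainty parameters for weights and biases).

```python
import numpy as np

rng = np.random.default_rng(1)

class VariationalLinear:
    """Sketch: linear layer with one shared uncertainty parameter.

    The weight posterior is approximated as N(mu, exp(log_var) * I),
    i.e. a single scalar log-variance for the whole layer instead of
    one variance per weight, so very few parameters are added.
    """
    def __init__(self, n_in, n_out):
        self.mu = rng.normal(0.0, 0.1, size=(n_in, n_out))  # weight means
        self.log_var = -4.0  # one scalar uncertainty parameter per layer

    def __call__(self, x):
        std = np.exp(0.5 * self.log_var)
        eps = rng.standard_normal(self.mu.shape)
        w = self.mu + std * eps  # reparameterization trick: w ~ N(mu, std^2)
        return x @ w

layer = VariationalLinear(3, 2)
x = np.ones((1, 3))
# Repeated forward passes differ because weights are resampled each call,
# which is what enables Monte Carlo estimates of predictive uncertainty.
y1, y2 = layer(x), layer(x)
```

During training, `mu` and `log_var` would be optimized jointly against a variational objective (ELBO); the sketch only illustrates the sampling side.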

Cited by 17 publications (21 citation statements)
References 26 publications
“…One possibility of describing the predictive uncertainty U_pred in a label y belonging to an input x with weights w is based on entropy, i.e., U_pred = H[y | w, x] [31]. Another way of quantifying uncertainty in the network parameters of BNNs is presented in [7]. This approach introduces only two uncertainty parameters per network layer, which allows uncertainty to be grasped layer-wise without impairing network convergence.…”
Section: Bayesian Deep Learning and Uncertainty Quantification
confidence: 99%
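The entropy-based predictive uncertainty mentioned in the citation above can be sketched as follows, assuming the predictive distribution is approximated by averaging softmax outputs over several stochastic forward passes of a Bayesian network; the function name `predictive_entropy` and the sample arrays are illustrative.

```python
import numpy as np

def predictive_entropy(probs):
    """Entropy H[y | x] of the mean predictive distribution.

    probs: array of shape (n_samples, n_classes) -- softmax outputs
    from n_samples stochastic forward passes of a Bayesian network.
    """
    p_mean = probs.mean(axis=0)  # Monte Carlo average over weight samples
    eps = 1e-12                  # avoids log(0) for confident predictions
    return -np.sum(p_mean * np.log(p_mean + eps))

# Three confident, agreeing samples -> low predictive uncertainty
low = predictive_entropy(np.array([[0.98, 0.01, 0.01]] * 3))
# Samples that disagree across classes -> high predictive uncertainty
high = predictive_entropy(np.array([[0.9, 0.05, 0.05],
                                    [0.1, 0.8, 0.1],
                                    [0.2, 0.1, 0.7]]))
assert high > low
```

The maximum value, log(n_classes), is reached when the averaged prediction is uniform over the classes.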
“…After having discussed the theoretical background, we describe our Bayesian model and the corresponding variational distribution. The model that we suggest has a similar structure to the framework in [7]. In the remaining section, the subscript indices w and b represent that a quantity is related to the network weights and biases, respectively.…”
Section: Bayesian PointNet
confidence: 99%
“…Further ways of quantifying uncertainty include the variance of the predictive network outputs and the estimation of (95%) credible intervals [ 44 ]. In the former case, the predictive variance is estimated empirically using the unbiased variance estimator.…”
Section: Data Modelling
confidence: 99%
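Both quantities mentioned in the citation above can be computed from Monte Carlo samples of the network output. A short sketch, assuming `samples` stands in for repeated stochastic forward passes at one regression input (the synthetic normal draws are a placeholder for real network outputs): the unbiased variance estimator corresponds to `ddof=1`, and a central 95% credible interval is read off the empirical 2.5% and 97.5% quantiles.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for 1000 stochastic forward passes at one input,
# e.g. obtained by sampling the weights of a Bayesian network.
samples = rng.normal(loc=2.0, scale=0.5, size=1000)

# Unbiased empirical estimate of the predictive variance (ddof=1
# divides by n - 1 rather than n).
pred_var = samples.var(ddof=1)

# Central 95% credible interval from the empirical quantiles.
lower, upper = np.percentile(samples, [2.5, 97.5])
print(f"variance ~ {pred_var:.3f}, 95% CI ~ [{lower:.2f}, {upper:.2f}]")
```

With real network outputs, wide intervals or large variance flag inputs on which the model's predictions are unreliable.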