2023
DOI: 10.48550/arxiv.2302.09656
Preprint
Imprecise Bayesian Neural Networks

Abstract: Uncertainty quantification and robustness to distribution shifts are important goals in machine learning and artificial intelligence. Although Bayesian neural networks (BNNs) allow for uncertainty in the predictions to be assessed, different sources of uncertainty are indistinguishable. We present imprecise Bayesian neural networks (IBNNs); they generalize and overcome some of the drawbacks of standard BNNs. The latter are trained using a single prior and likelihood distribution, whereas IBNNs are trained u…
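The abstract contrasts standard BNNs, trained with a single prior and likelihood, with IBNNs trained over sets of distributions. As a hedged illustration of that idea only (not the paper's actual construction, which the truncated abstract does not fully specify), the toy sketch below runs closed-form Bayesian inference for a slope-only linear model under a *set* of prior variances and reports the envelope of posterior predictive means as an interval-valued prediction; the model, data, and all names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = 2x + noise.
X = np.linspace(-1, 1, 20)
y = 2.0 * X + rng.normal(0, 0.1, size=X.shape)

def posterior_predictive_mean(x, prior_var, noise_var=0.01):
    """Closed-form Bayesian linear regression (slope-only model):
    prior w ~ N(0, prior_var), likelihood y ~ N(w*x, noise_var)."""
    precision = 1.0 / prior_var + (X @ X) / noise_var  # posterior precision of w
    mean_w = (X @ y / noise_var) / precision           # posterior mean of w
    return mean_w * x

# A *set* of priors is the "imprecise" ingredient: each prior variance
# induces a different posterior, hence a different predictive mean.
prior_vars = [0.01, 0.1, 1.0, 10.0]
x_test = 0.5
preds = [posterior_predictive_mean(x_test, v) for v in prior_vars]

# Interval-valued prediction: lower/upper envelope over the prior set.
lower, upper = min(preds), max(preds)
print(f"predictive interval at x={x_test}: [{lower:.3f}, {upper:.3f}]")
```

The interval width here reflects sensitivity to the prior choice, which is one way a set of priors can separate prior-driven (epistemic) uncertainty from observation noise.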

Cited by 1 publication (11 citation statements)
References: 38 publications
“…Table 2: Uncertainty Quantification Techniques and Relevant Literature

Deterministic NNs: Inductive Bias [9], Temperature-scaling Calibration [76], Inhibited Softmax [17], Radial Basis Function Networks (DUQ) [77]; (Regression) Quantile Loss Function [78], Virtual Residual Pretraining [79], Interval NNs [80]; External Gradient Metrics [81][82][83], Additional NN for Uncertainty [84][42], Spectral-Normalized Gaussian Process (SNGP) [85], Hybrid Softmax & Feature Space Regularization (DDU) [86]

Ensembles: Deep Ensembles, Automated deep ensemble with uncertainty quantification (AutoDEUQ) [87], Combination of Base and Meta model [55]; (BNN Ensembles) Bayesian Ensembling [88][89], Approximately Bayesian Ensembling [90]

Set-based Conformal Prediction: Deep Conformal Prediction [92], Inductive Conformal Prediction [93], Conformal in NN [94], Conformal Prediction in CNN [95]

Credal Sets: Credal UQ [96], Credal Semi-Supervised Learning [97], Credal Bayesian Neural Network [98]

Other Literature: Test-time augmentation methods [99][100], Epistemic Neural Networks [56]…”

Section: Deterministic NNs
confidence: 99%
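The cited table groups ensembles among the uncertainty quantification techniques. A minimal sketch of how a deep ensemble's disagreement yields an epistemic-uncertainty score, using the standard total-minus-aleatoric entropy decomposition (the softmax outputs below are made-up numbers, not from any cited work):

```python
import numpy as np

# Hypothetical softmax outputs of a 3-member ensemble on one input (3 classes).
member_probs = np.array([
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
    [0.6, 0.3, 0.1],
])

mean_p = member_probs.mean(axis=0)  # ensemble-averaged predictive distribution

def entropy(p):
    """Shannon entropy along the last axis."""
    return -(p * np.log(p)).sum(axis=-1)

total = entropy(mean_p)                    # total predictive uncertainty
aleatoric = entropy(member_probs).mean()   # expected per-member entropy
epistemic = total - aleatoric              # mutual information: member disagreement
print(f"total={total:.4f} aleatoric={aleatoric:.4f} epistemic={epistemic:.4f}")
```

By Jensen's inequality the epistemic term is non-negative, and it is zero exactly when all ensemble members agree, which is why it is read as a disagreement-based epistemic signal.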
“…Moreover, the posterior distribution of standard BNNs does not allow epistemic uncertainty to be captured for each sample [7]. In more recent literature focusing on uncertainty calibration, BNNs have been identified as providing overconfident and miscalibrated uncertainty estimations for OOD samples [98].…”

Section: Bayesian Neural Network and Bayesian Deep Neural Network
confidence: 99%