2020
DOI: 10.48550/arxiv.2007.06096
Preprint

BaCOUn: Bayesian Classifiers with Out-of-Distribution Uncertainty

Théo Guénais, Dimitris Vamvourellis, Yaniv Yacoby, et al.

Abstract: Traditional training of deep classifiers yields overconfident models that are not reliable under dataset shift. We propose a Bayesian framework to obtain reliable uncertainty estimates for deep classifiers. Our approach consists of a plug-in "generator" used to augment the data with an additional class of points that lie on the boundary of the training data, followed by Bayesian inference on top of features that are trained to distinguish these "out-of-distribution" points.
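The abstract describes the pipeline only at a high level. The sketch below shows one way such a pipeline could look; it is not the authors' implementation. It assumes PyTorch, a 2-D toy dataset, a simple Gaussian-perturbation stand-in for the plug-in boundary "generator", and a diagonal last-layer Laplace approximation (a neural-linear-style choice) for the Bayesian inference step. All of these specifics are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a BaCOUn-style pipeline (NOT the authors' code).
# Assumptions: PyTorch, toy 2-D data, Gaussian perturbations standing in for the
# boundary "generator", diagonal last-layer Laplace approximation for inference.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy two-class training data: two Gaussian blobs.
n = 200
x_in = torch.cat([torch.randn(n, 2) + torch.tensor([-2.0, 0.0]),
                  torch.randn(n, 2) + torch.tensor([2.0, 0.0])])
y_in = torch.cat([torch.zeros(n, dtype=torch.long), torch.ones(n, dtype=torch.long)])

# Hypothetical stand-in for the plug-in generator: perturb training points so they
# land near/outside the boundary of the data, and label them as an extra class K = 2.
x_ood = x_in + 2.0 * torch.randn_like(x_in)
y_ood = torch.full((x_ood.shape[0],), 2, dtype=torch.long)
x, y = torch.cat([x_in, x_ood]), torch.cat([y_in, y_ood])

# Feature extractor trained to distinguish the in-distribution classes
# from the generated "out-of-distribution" class.
features = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
head = nn.Linear(32, 3)
opt = torch.optim.Adam(list(features.parameters()) + list(head.parameters()), lr=1e-2)
for _ in range(500):
    opt.zero_grad()
    F.cross_entropy(head(features(x)), y).backward()
    opt.step()

# Bayesian inference on top of the learned features: here a diagonal Gauss-Newton
# Laplace approximation over the last-layer weights (a neural-linear-style choice;
# the paper's exact inference procedure may differ).
with torch.no_grad():
    phi = features(x)                          # fixed features, shape (N, 32)
    p = F.softmax(head(phi), dim=1)            # class probabilities, shape (N, 3)
    ggn_diag = (p * (1 - p)).T @ (phi ** 2)    # per-weight curvature, shape (3, 32)
    w_var = 1.0 / (ggn_diag + 1.0)             # posterior variance, prior precision 1.0

def predict(x_new, n_samples=50):
    """Monte Carlo predictive distribution; mass on the extra class flags OOD inputs."""
    with torch.no_grad():
        phi_new = features(x_new)
        samples = []
        for _ in range(n_samples):
            w = head.weight + w_var.sqrt() * torch.randn_like(head.weight)
            samples.append(F.softmax(phi_new @ w.T + head.bias, dim=1))
        probs = torch.stack(samples).mean(0)
    return probs[:, :2], probs[:, 2]           # in-distribution probs, OOD score

in_probs, ood_score = predict(torch.tensor([[0.0, 0.0], [10.0, 10.0]]))
print(ood_score)  # the far-away point should receive higher OOD mass
```

In this sketch the predictive mass placed on the extra class acts as an out-of-distribution score, which is the mechanism the abstract points to for reliability under dataset shift.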

Cited by 3 publications (3 citation statements)
References 16 publications

“…For larger datasets, more scalable models such as neural linear model (NLM) are used [23]. Assuming linearity, one may also implement Bayesian linear regression for binary/categorical or the generalized linear model for continuous variables. 2. when L_r = ∅.…”
Section: E4 MI Sampling
confidence: 99%
“…Diversity based approaches include Core-Set [43] which sequentially selects non-semantically similar points. Uncertainty estimation [10,12,13,16,25,31] provides quantitative measures to model ambiguity in the prediction. Bayesian Neural Networks have been traditionally used to estimate uncertainty; however recent approaches [47,53] explore computing uncertainty using a single forward pass through the network.…”
Section: Related Work
confidence: 99%
“…Application examples: Guénais et al [44] proposed a Bayesian framework to obtain reliable uncertainty estimates for deep classifiers. Their approach consists of a plug-in "generator" used to augment the data with an additional class of points on the boundary of the training data, followed by Bayesian inference on top of features trained to distinguish these "out-of-distribution" points.…”
Section: Introduction
confidence: 99%