2021
DOI: 10.48550/arxiv.2105.13283
Preprint

Deep Ensembles from a Bayesian Perspective

Abstract: Deep ensembles can be seen as the current state of the art for uncertainty quantification in deep learning. While the approach was originally proposed as a non-Bayesian technique, arguments towards its Bayesian footing have been put forward as well. We show that deep ensembles can be viewed as an approximate Bayesian method by specifying the corresponding assumptions. Our finding leads to an improved approximation which results in an increased epistemic part of the uncertainty. Numerical examples suggest that…
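For orientation, the following is a minimal sketch of the standard recipe for extracting uncertainty from a deep ensemble in regression (in the style of Lakshminarayanan et al., 2017), not the improved approximation derived in the paper: each member predicts a mean and a variance, and the disagreement between the member means is taken as the epistemic part. The array names and shapes are hypothetical.

# Minimal sketch, assuming each of the M ensemble members outputs a
# predictive mean and variance at N test inputs; not the improved
# approximation derived in the paper.
import numpy as np

def combine_ensemble(mus, sigma2s):
    """mus, sigma2s: hypothetical arrays of shape (M, N)."""
    mu = mus.mean(axis=0)             # ensemble predictive mean
    aleatoric = sigma2s.mean(axis=0)  # average predicted data noise
    epistemic = mus.var(axis=0)       # spread of the member means
    return mu, aleatoric + epistemic  # mean and total predictive variance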

Cited by 12 publications (14 citation statements)
References 28 publications
“…The advantage of BMA is the ability to simultaneously consider all possible predictions given a prior and conditioned on training data, thereby mitigating the risk inherent in trusting the predictions of any single model. A line of recent work considers individual deep ensemble members as samples from p(f|D), and treats the ensemble output as a Monte Carlo approximation to BMA (Hoffmann & Elster, 2021; Wilson & Izmailov, 2020). We also note a line of work that emphasizes differences between ensembling and BMA (Minka, 2000; He et al., 2020).…”
Section: Interpretations
confidence: 99%
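The Monte Carlo view in the excerpt above can be made concrete: the M trained members act as approximate posterior samples, and the posterior predictive follows by averaging their predictive distributions, p(y|x, D) ≈ (1/M) Σ_m p(y|x, θ_m). Below is a hedged sketch for classification, with a hypothetical array of per-member softmax outputs; the entropy gap is a common measure of the epistemic part of the uncertainty.

# Sketch of the Monte Carlo approximation to BMA described above: the
# ensemble members' weights are treated as approximate samples from
# p(θ|D). `member_probs` is a hypothetical input array.
import numpy as np

def bma_predict(member_probs):
    """member_probs: shape (M, N, C) — softmax outputs of M members for
    N inputs over C classes. Returns the averaged predictive, (N, C)."""
    return member_probs.mean(axis=0)

def epistemic_gap(member_probs, eps=1e-12):
    """Entropy of the averaged predictive minus the average member
    entropy (the mutual information between label and parameters)."""
    avg = member_probs.mean(axis=0)
    h_avg = -(avg * np.log(avg + eps)).sum(axis=-1)
    h_mem = -(member_probs * np.log(member_probs + eps)).sum(axis=-1).mean(axis=0)
    return h_avg - h_mem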
“…In contrast to the bagging algorithms discussed above, heterogeneity of deep ensemble members does not stem from bootstrapping the data, but rather from stochasticity in initializing and optimizing the neural networks. Several contributions suggest that deep ensembles benefit prediction performance, uncertainty quantification and out-of-distribution generalization (Fort et al., 2019; Wilson and Izmailov, 2020; Hoffmann and Elster, 2021). Abe et al. (2022) question the extent to which these benefits hold.…”
Section: Classical Deep Ensembles
confidence: 99%
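To make the contrast in this excerpt concrete, here is a hedged sketch of the two diversity mechanisms: a deep ensemble trains every member on the full data and varies only the random seed, while bagging fits each member on a bootstrap resample. scikit-learn's MLPRegressor is used purely as a stand-in for a generic network trainer with a controllable seed.

# Hedged sketch contrasting the two sources of ensemble diversity named
# above; MLPRegressor is an illustrative stand-in, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPRegressor

def deep_ensemble(X, y, M=5):
    # Same data for every member: diversity comes solely from random
    # initialization and the stochasticity of optimization.
    return [MLPRegressor(random_state=m, max_iter=1000).fit(X, y)
            for m in range(M)]

def bagged_ensemble(X, y, M=5, seed=0):
    # Bagging: each member is fit on a bootstrap resample of the data.
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(M):
        idx = rng.integers(0, len(X), size=len(X))
        members.append(MLPRegressor(random_state=0, max_iter=1000).fit(X[idx], y[idx]))
    return members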
“…In addition to approaches based on variational inference, there are other approaches to quantify the uncertainty about θ [18, 43–45]. The approach of deep ensembles [18], introduced as a sort of frequentist approach, can also be viewed as a Bayesian approach [46, 47].…”
Section: A Primer on (Deep) Regression and Uncertainty
confidence: 99%