2019
DOI: 10.48550/arxiv.1905.11001
Preprint

On Mixup Training: Improved Calibration and Predictive Uncertainty for Deep Neural Networks
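
For context, mixup, the training scheme whose calibration behavior this preprint studies, trains on convex combinations of input pairs and of their labels, with the mixing weight drawn from a Beta distribution. A minimal sketch, assuming numpy arrays of inputs and one-hot labels; the function name and the alpha default are illustrative, not taken from the paper:

```python
import numpy as np

def mixup_batch(x, y, alpha=0.4, rng=None):
    """Return a convexly mixed batch: x~ = lam*x_i + (1-lam)*x_j."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)           # mixing coefficient in (0, 1)
    perm = rng.permutation(len(x))         # random partner j for each i
    x_mixed = lam * x + (1.0 - lam) * x[perm]
    y_mixed = lam * y + (1.0 - lam) * y[perm]  # soft labels, the source of the calibration effect
    return x_mixed, y_mixed
```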

Cited by 22 publications (34 citation statements: 0 supporting, 34 mentioning, 0 contrasting). References 0 publications.

“…Our work inspires a range of interesting future directions, including theoretical investigations of the trade-offs between accuracy and robustness for NFM and applications of NFM beyond computer vision tasks. Further, it will be interesting to study whether NFM may also lead to better model calibration by extending the analysis of [64,81].…”
Section: Discussion (mentioning)
confidence: 99%
“…Recently, it was found that although modern machine learning methods, such as neural networks, made significant progress with respect to the predictive accuracy in a variety of learning tasks [30,54,55], their predictive uncertainties are poorly calibrated [24]. To mitigate this issue, several recalibration methods and their variants have been proposed, including scaling approaches [24,48,64], binning approaches [27,47,63], scaling-binning [41], and data augmentation [56,65]. Recent work has begun to develop the theoretical analysis of these methods.…”
Section: Additional Related Work (mentioning)
confidence: 99%
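
Among the recalibration methods this excerpt lists, temperature scaling is the simplest scaling approach: a single scalar T is fit on held-out logits to minimize negative log-likelihood, then divides the logits at test time. A hedged sketch using a plain grid search; the names and the grid range are assumptions, not taken from any cited paper:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(val_logits, val_labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the T that minimizes NLL on a held-out validation split."""
    def nll(T):
        p = softmax(val_logits / T)
        return -np.log(p[np.arange(len(val_labels)), val_labels] + 1e-12).mean()
    return min(grid, key=nll)

# At test time: calibrated_probs = softmax(test_logits / T)
```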
“…Confidence calibration: Confidence is one possible metric of inference performance, and there are a wealth of confidence calibration approaches in the literature, such as Platt/Temperature scaling [5], Histogram binning [7], Bayesian Binning into Quantiles (BBQ) [9], Isotonic regression [10], and Platt scaling extensions [8]. Several works [6], [11], [16], [17], [18], [19], [20], [21] consider either algorithmic improvements or application specific challenges associated to uncertainty quantification. [22] studies model uncertainty under dataset shift and provides empirical comparison of different calibration techniques.…”
Section: Related Work (mentioning)
confidence: 99%
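
The calibration these excerpts discuss is usually quantified with expected calibration error (ECE): predictions are binned by confidence, and the per-bin gap between accuracy and mean confidence is averaged with bin weights. A small self-contained sketch, assuming an (N, C) array of predicted class probabilities; the 15 equal-width bins are a common convention, not a requirement of the paper:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=15):
    conf = probs.max(axis=1)               # top-class confidence
    pred = probs.argmax(axis=1)
    correct = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - conf[in_bin].mean())
            ece += in_bin.mean() * gap     # weight gap by bin population
    return ece
```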