2020 25th International Conference on Pattern Recognition (ICPR), 2021
DOI: 10.1109/icpr48806.2021.9413010

On-manifold Adversarial Data Augmentation Improves Uncertainty Calibration

Cited by 18 publications (13 citation statements) | References 8 publications
“…As has been empirically shown in [280] and [288], the adaptive binning calibration measures aECE and aSCE are more robust to the number of bins than the corresponding equal-width binning calibration measures ECE and SCE.…”
Section: B. Evaluating Calibration Quality (mentioning)
confidence: 82%
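The quoted claim contrasts equal-width binning (ECE) with adaptive, equal-mass binning (aECE), the latter being more robust to the choice of bin count. A minimal sketch of both estimators for top-1 predictions; the function and parameter names are illustrative, not from the cited papers:

```python
import numpy as np

def ece(conf, correct, n_bins=10, adaptive=False):
    """Expected Calibration Error over top-1 confidences.

    conf     : predicted confidence per sample, in [0, 1]
    correct  : 1.0 if the prediction was correct, else 0.0
    adaptive : if True, use equal-mass (quantile) bins (aECE);
               otherwise equal-width bins over [0, 1] (ECE).
    """
    conf = np.asarray(conf, dtype=float)
    correct = np.asarray(correct, dtype=float)
    if adaptive:
        # Equal-mass bins: each bin holds roughly the same number of samples.
        edges = np.quantile(conf, np.linspace(0.0, 1.0, n_bins + 1))
    else:
        # Equal-width bins over the confidence range [0, 1].
        edges = np.linspace(0.0, 1.0, n_bins + 1)
    # Map each sample to a bin index in {0, ..., n_bins - 1}.
    idx = np.clip(np.digitize(conf, edges[1:-1]), 0, n_bins - 1)
    err = 0.0
    for b in range(n_bins):
        m = idx == b
        if m.any():
            # Weighted gap between accuracy and mean confidence in the bin.
            err += m.mean() * abs(correct[m].mean() - conf[m].mean())
    return err
```

With equal-width bins, high-confidence bins can be nearly empty while a few bins dominate the estimate; quantile edges keep every bin populated, which is why the adaptive variants vary less with `n_bins`.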
“…Fig. 9: Visualization of the different types of uncertainty calibration methods presented in this paper.…” (The quoted passage is figure residue; only the caption is recoverable.)
Section: A. Calibration Methods (mentioning)
confidence: 99%
“…For automotive applications this creates unacceptable latencies; thus, more computationally efficient methods are needed. Sampling-free uncertainty estimation [21] and data augmentation [22], [23] are candidates. In comparison to the methods above that modify the training process, a simple approach that requires no retraining of the models is post-hoc calibration [2].…”
Section: Related Work (mentioning)
confidence: 99%
“…However, applications with limited complexity overhead and latency require sampling-free and single-model based calibration methods. Examples include modifying the training loss [14], scalable Gaussian processes [18], sampling-free uncertainty estimation [22], data augmentation [20,25,28,7] and ensemble distribution distillation [17]. In comparison, a simple approach that requires no retraining of the models is post-hoc calibration [5].…”
Section: Related Work (mentioning)
confidence: 99%
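Both quoted passages single out post-hoc calibration as the approach that needs no retraining. The source does not name a specific method, but a common instance is temperature scaling: a single scalar T is fit on held-out logits to minimize negative log-likelihood, then divides the logits at test time. A minimal sketch using a grid search instead of gradient-based optimization; all names here are illustrative:

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T applied to logits."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 46)):
    """Pick the temperature that minimizes NLL on a held-out set.

    logits : (n, k) array of uncalibrated model outputs
    labels : (n,) array of integer class labels
    """
    labels = np.asarray(labels)
    rows = np.arange(len(labels))

    def nll(T):
        p = softmax(np.asarray(logits, dtype=float), T)
        return -np.log(p[rows, labels] + 1e-12).mean()

    # Grid search is enough for a single scalar parameter.
    return min(grid, key=nll)
```

T > 1 softens an overconfident model's probabilities without changing its argmax predictions, which is why this fits the "no retraining" description in the quotes.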