2020
DOI: 10.3390/make2040028
Towards Knowledge Uncertainty Estimation for Open Set Recognition

Abstract: Uncertainty is ubiquitous and happens in every single prediction of Machine Learning models. The ability to estimate and quantify the uncertainty of individual predictions is arguably relevant, all the more in safety-critical applications. Real-world recognition poses multiple challenges since a model's knowledge about physical phenomena is not complete, and observations are incomplete by definition. However, Machine Learning algorithms often assume that train and test data distributions are the same and that…

Cited by 7 publications (4 citation statements)
References: 45 publications
“…To define τ_k, we used the 95th percentile of the training uncertainty values, meaning that τ_k = P_{95%}[KUE]. A detailed description of this approach is available in [24].…”
Section: Methods
confidence: 99%
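As an illustration of the thresholding step quoted above, the sketch below (hypothetical variable names, placeholder data) sets the rejection threshold τ_k at the 95th percentile of uncertainty values computed on the training set and rejects test inputs whose uncertainty exceeds it.

```python
import numpy as np

# Illustrative sketch of the percentile-based threshold described above.
# `kue_train` and `kue_test` are hypothetical per-sample uncertainty values;
# in the cited approach they would come from the KUE density model.
rng = np.random.default_rng(0)
kue_train = rng.random(1000)   # placeholder training uncertainty values
kue_test = rng.random(200)     # placeholder test-time uncertainty values

# tau_k = P_95%[KUE]: the 95th percentile of the training uncertainty values.
tau_k = np.percentile(kue_train, 95)

# Reject (treat as unknown) any test input whose uncertainty exceeds tau_k.
reject = kue_test > tau_k
print(f"tau_k = {tau_k:.3f}; rejected {reject.sum()} of {len(kue_test)} inputs")
```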
“…Traditional methods, such as Kernel Density Estimation (KDE), can be used to estimate densities, and often, threshold-based methods are applied on top of the density where a classifier can refuse to predict a test input in that region [23]. In this context, Knowledge Uncertainty Estimation (KUE) [24] learns the feature density estimation from the training data, to reject test inputs that represent a density different from the training dataset. For a test input x_i, represented by P-dimensional feature vectors, where f_j ∈ {f_1, …”
Section: Uncertainty Quantification
confidence: 99%
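The density-plus-threshold idea described in this statement can be sketched briefly. The snippet below is not the authors' exact KUE formulation; it is a minimal illustration using scikit-learn's KernelDensity with placeholder feature matrices, where test inputs whose estimated log-density falls below a low percentile of the training log-densities are rejected.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# Placeholder P-dimensional feature matrices (illustration only).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 8))   # in-distribution training features
X_test = rng.normal(size=(100, 8))    # test features, possibly out-of-distribution

# Fit a kernel density estimator on the training features.
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(X_train)

# The training data's own log-densities define what "typical" looks like;
# a low percentile of them serves as the rejection threshold.
train_logdens = kde.score_samples(X_train)
threshold = np.percentile(train_logdens, 5)

# Reject test inputs whose estimated density falls below the threshold.
test_logdens = kde.score_samples(X_test)
reject = test_logdens < threshold
print(f"rejected {reject.sum()} of {len(X_test)} test inputs")
```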
“…There are a number of methods, like this work, that focus on the inference process. KNN and distance metrics for anomaly detection have been studied previously [18,5,26], and have recently seen use in OOD and adversarial detection [35,1,39]. Furthermore, the notion of rejecting a classification has been studied in depth [14,22,4].…”
Section: Related Work
confidence: 99%
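For the kNN/distance-based detectors referenced in this statement, a generic sketch (not tied to any specific cited method) scores each test input by its distance to the k-th nearest training sample and flags inputs whose distance exceeds a percentile of the training set's own k-NN distances.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Generic kNN-distance OOD score, for illustration only.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 16))          # in-distribution features
X_test = rng.normal(loc=3.0, size=(50, 16))   # shifted, likely-OOD features

k = 10
nn = NearestNeighbors(n_neighbors=k).fit(X_train)

# Score = distance to the k-th nearest training neighbour;
# larger distances mean the input lies far from the training data.
dist_test, _ = nn.kneighbors(X_test)
ood_score = dist_test[:, -1]

# Threshold taken from the training set's own k-NN distances
# (note: each training point's nearest neighbour is itself, at distance 0).
dist_train, _ = nn.kneighbors(X_train)
threshold = np.percentile(dist_train[:, -1], 95)
print(f"flagged {int((ood_score > threshold).sum())} of {len(X_test)} test inputs as OOD")
```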
“…These concepts primarily deal with model uncertainty [22][23][24] (the inverse propagation problem) rather than measurement uncertainty [25]. Recently, researchers have shown increased interest in quantifying the uncertainty related to ML models [26][27][28]. In the systematic literature reviews of ML and deep learning applications in smart cities [11] and in the COVID-19 epidemic [12], the accuracy parameter received the most attention from the reviewed articles (28.1% and 48.8%, respectively).…”
Section: Introduction
confidence: 99%