2015
DOI: 10.1016/j.neucom.2014.10.092

Efficient rejection strategies for prototype-based classification

Abstract: Due to intuitive training algorithms and model representation, prototype-based models are popular in settings where on-line learning and model interpretability play a major role. In such cases, a crucial property of a classifier is not only which class to predict, but also whether a reliable decision is possible in the first place, or whether it is better to reject a decision. While strong theoretical results for optimum reject options in the case of known probability distributions or estimations thereof are available…
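The paper concerns reject options for prototype-based classifiers, i.e. the ability to abstain from a prediction when no reliable decision is possible. As a rough illustration of this idea only (not the authors' specific strategies), the following Python sketch rejects a nearest-prototype decision whenever the distance to the winning prototype exceeds a threshold; the function name, variable names, and the threshold `theta` are assumptions made for illustration.

```python
# Minimal sketch of a distance-threshold reject option for a nearest-prototype
# classifier. Illustration of the general idea only; the names used here are
# assumptions, not the strategies proposed in the paper.
import numpy as np

def classify_with_reject(x, prototypes, proto_labels, theta):
    """Return the label of the closest prototype, or None to reject."""
    dists = np.linalg.norm(prototypes - x, axis=1)  # distance to every prototype
    winner = int(np.argmin(dists))
    if dists[winner] > theta:                       # sample is far from all prototypes
        return None                                 # reject the decision
    return proto_labels[winner]
```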

Cited by 32 publications (28 citation statements); references 30 publications. Citing publications range from 2015 to 2024.
“…It has been analysed lately that this value serves as an efficient estimation of a confidence for classification with rejection showing similar performance as an explicit probabilistic modelling but at lower computational costs [43]. The calculation of the RelSim for unlabelled data points x is based on the estimated class label with respect to the model, i. e. d…”
Section: Learning Vector Quantizationmentioning
confidence: 99%
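The relative similarity (RelSim) mentioned in the excerpt above compares the distance to the closest prototype of the predicted class with the distance to the closest prototype of any other class, yielding a confidence value that can be thresholded for rejection. Below is a minimal sketch of such a rule, assuming a plain Euclidean nearest-prototype model; the function and threshold names are illustrative and not taken from [43] or the paper.

```python
# Hedged sketch of a RelSim-style confidence with rejection. d_plus is the
# distance to the closest prototype of the predicted class, d_minus the
# distance to the closest prototype of any other class; `theta_r` is an
# illustrative threshold name.
import numpy as np

def relsim_reject(x, prototypes, proto_labels, theta_r):
    """Return (predicted_label_or_None, relsim)."""
    proto_labels = np.asarray(proto_labels)
    dists = np.linalg.norm(prototypes - x, axis=1)
    predicted = proto_labels[np.argmin(dists)]
    d_plus = dists[proto_labels == predicted].min()   # closest same-class prototype
    d_minus = dists[proto_labels != predicted].min()  # closest other-class prototype
    relsim = (d_minus - d_plus) / (d_plus + d_minus)  # in [0, 1]; larger = more confident
    return (predicted if relsim >= theta_r else None, relsim)
```

Because RelSim only reuses distances that a prototype-based classifier computes anyway, it adds essentially no overhead compared to fitting an explicit probabilistic model, which matches the point made in the excerpt.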
“…The latter case occurs if we observe overlapping class distributions in the training data, i.e. ambiguous regions near the class borders, whereas outliers in the data are an example of aleatoric uncertainty [96,97], see Fig. 4.…”
Section: Reject or Classify -Secure Classificationmentioning
confidence: 99%
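As a rough sketch of the distinction drawn in the excerpt above, one can separate rejects caused by ambiguity near class borders (low relative similarity between competing classes) from rejects caused by outliers (large distance to every prototype). The thresholds and names below are purely hypothetical values chosen for illustration.

```python
# Illustrative sketch only: distinguishing the two rejection causes described
# above. `theta_amb` and `theta_out` are hypothetical thresholds.
def reject_reason(relsim, winner_dist, theta_amb=0.1, theta_out=5.0):
    if winner_dist > theta_out:
        return "reject: outlier"     # far away from all prototypes
    if relsim < theta_amb:
        return "reject: ambiguous"   # nearly equidistant to competing classes
    return "classify"
```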
“…NPC algorithms decrease classification time by reducing the number of instances in the data set [22]. Since the accuracy of such classification techniques is very important, Deng et al. [25] apply transfer learning to prototype-based fuzzy clustering (PFC) in order to address the limitation and scarcity of data for the clustering task, and Fischer et al. [36] present simple and efficient reject options for prototype-based classification to reach a reliable decision.…”
Section: Introductionmentioning
confidence: 99%
“…expresses preferences of a representative user k on the item j, and … the closeness of the user to the representative user k. We normalize … to get … (36). We assign … according to the rules:…”
mentioning
confidence: 99%