2007
DOI: 10.1007/978-3-540-76725-1_41

Confusion Matrix Disagreement for Multiple Classifiers

Abstract: We present a methodology to analyze Multiple Classifier Systems (MCS) performance using the disagreement concept. The goal is to define an alternative to the conventional recognition-rate criterion, which usually requires an exhaustive combination search. The approach defines a Distance-based Disagreement (DbD) measure using a Euclidean distance computed between confusion matrices, together with a soft-correlation rule, to indicate the most likely candidates for the best classifier ensemble. As a case study, we…
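
Below is a minimal sketch of the DbD idea described in the abstract, assuming disagreement between two classifiers is measured as the Euclidean (Frobenius) distance between their confusion matrices. The row normalization and the simple pair ranking are illustrative assumptions; they stand in for the paper's soft-correlation rule, which is not reproduced here.

```python
# Minimal sketch of Distance-based Disagreement (DbD): disagreement between
# two classifiers is the Euclidean distance between their confusion matrices.
from itertools import combinations

import numpy as np


def dbd_distance(cm_a: np.ndarray, cm_b: np.ndarray) -> float:
    """Euclidean (Frobenius) distance between two confusion matrices."""
    # Row-normalize so class imbalance does not dominate the distance
    # (assumption; the paper may operate on raw counts).
    a = cm_a / cm_a.sum(axis=1, keepdims=True)
    b = cm_b / cm_b.sum(axis=1, keepdims=True)
    return float(np.linalg.norm(a - b))


def rank_pairs(conf_matrices: dict[str, np.ndarray]) -> list[tuple[float, str, str]]:
    """Rank classifier pairs by disagreement, most disagreeing first
    (a simple stand-in for the paper's soft-correlation rule)."""
    pairs = [
        (dbd_distance(conf_matrices[i], conf_matrices[j]), i, j)
        for i, j in combinations(conf_matrices, 2)
    ]
    return sorted(pairs, reverse=True)


# Toy 2-class confusion matrices for three hypothetical classifiers.
cms = {
    "svm": np.array([[45, 5], [10, 40]]),
    "knn": np.array([[40, 10], [5, 45]]),
    "mlp": np.array([[44, 6], [9, 41]]),
}
for d, i, j in rank_pairs(cms):
    print(f"{i} vs {j}: DbD = {d:.3f}")
```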

Cited by 16 publications (10 citation statements). References 13 publications.

“…Therefore, lacking a biologically-plausible similarity measure, we use the distance between the confusion matrices of the human observers and the learned model (cf. [16]). The model by Weijer et al. has a distance of 0.73, whereas our model has a distance of 0.57.…”
Section: B. Results
mentioning, confidence: 93%

“…The confusion matrix is a table counting, for the test data, how many instances the classification model predicts correctly and incorrectly for each class. A confusion matrix is needed to select the best-performing classification model [19]. This method is often used with multiple classifiers or with more than two classes.…”
Section: E. Confusion Matrix
mentioning, confidence: 99%

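A minimal sketch of the structure the quoted statement describes: rows are true classes, columns are predicted classes, and each cell counts test instances. The labels and predictions below are made-up toy data.

```python
# Build a multi-class confusion matrix from true and predicted labels.
from sklearn.metrics import confusion_matrix

y_true = ["cat", "cat", "dog", "dog", "bird", "bird", "bird"]
y_pred = ["cat", "dog", "dog", "dog", "bird", "cat", "bird"]

labels = ["bird", "cat", "dog"]
cm = confusion_matrix(y_true, y_pred, labels=labels)
print(labels)
print(cm)  # cm[i, j] = count of class labels[i] predicted as labels[j]
```
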
“…The accuracy of a classifier can be obtained directly from its confusion matrix. In their work, Freitas, de Carvalho, and Oliveira (2007) defined a simple measure on the confusion matrix, the global performance index, for evaluating the performance of ensemble classifiers.…”
Section: Some Measures Defined on Confusion Matrix
mentioning, confidence: 99%
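
As the statement notes, accuracy falls out of the confusion matrix directly: correct predictions sit on the diagonal. A minimal sketch with a toy 3-class matrix (the global performance index itself is defined in the cited paper and is not reproduced here):

```python
# Accuracy from a confusion matrix: correct predictions lie on the diagonal,
# so accuracy = trace(cm) / total count. Toy 3-class matrix for illustration.
import numpy as np

cm = np.array([
    [50,  3,  2],   # true class 0
    [ 4, 45,  6],   # true class 1
    [ 1,  2, 47],   # true class 2
])

accuracy = np.trace(cm) / cm.sum()
print(f"accuracy = {accuracy:.4f}")  # 142 / 160 = 0.8875
```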