2010
DOI: 10.1016/j.patrec.2010.05.017
Two information-theoretic tools to assess the performance of multi-class classifiers

Abstract: We develop two tools to analyze the behavior of multiple-class, or multi-class, classifiers by means of entropic measures on their confusion matrix or contingency table. First we obtain a balance equation on the entropies that captures interesting properties of the classifier. Second, by normalizing this balance equation we first obtain a 2-simplex in a three-dimensional entropy space and then the de Finetti entropy diagram or entropy triangle. We also give examples of the assessment of classifiers with these …

Cited by 19 publications (38 citation statements) · References 10 publications
“…We argued in [4] for doing this evaluation with the relatively new framework of entropy balance equations and their related entropy triangles [1,4,5]. The gist of this framework is that we can information-theoretically assess the classifier that carried out the prediction and obtained the confusion matrix P_{K K̂} by analyzing the entropies and informations in the related distribution P_{K K̂} through the following balance equation [5]:

H_{U_K · U_K̂} = ΔH_{P_K · P_K̂} + 2 · MI_{P_K K̂} + VI_{P_K K̂}    (1)

0 ≤ ΔH_{P_K · P_K̂}, MI_{P_K K̂}, VI_{P_K K̂} ≤ H_{U_K · U_K̂}

where U_K and U_K̂ are the uniform distributions on the supports of P_K and P_K̂, respectively, and the information-theoretic quantities are: a) the divergence with respect to uniformity, ΔH_{P_K · P_K̂}, between the joint distribution in which P_K and P_K̂ are independent and the uniform distributions with the same cardinality of events as P_K and P_K̂; b) the mutual information, MI_{P_K K̂} [6,7], quantifying the strength of the stochastic binding between P_K…”
Section: (B)mentioning
confidence: 99%
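The balance equation quoted above can be checked numerically from any confusion matrix. The following is a minimal sketch (not the authors' reference implementation; the function and variable names are illustrative) that computes each term of equation (1) in bits and verifies that they sum to the joint entropy of the uniform marginals:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero-safe)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_balance(counts):
    """Decompose a confusion matrix into the terms of the balance equation.

    Returns (H_U, dH, MI, VI) with H_U == dH + 2*MI + VI, where H_U is the
    joint entropy of the independent uniform marginals, dH the divergence
    with respect to uniformity, MI the mutual information, and VI the
    variation of information (sum of the two conditional entropies).
    """
    P = counts / counts.sum()                     # joint distribution P_{K K̂}
    pk = P.sum(axis=1)                            # true-class marginal P_K
    pk_hat = P.sum(axis=0)                        # predicted-class marginal P_K̂
    H_U = np.log2(len(pk)) + np.log2(len(pk_hat)) # entropy of uniform marginals
    Hx, Hy, Hxy = entropy(pk), entropy(pk_hat), entropy(P)
    dH = H_U - Hx - Hy                            # divergence from uniformity
    MI = Hx + Hy - Hxy                            # mutual information
    VI = 2 * Hxy - Hx - Hy                        # variation of information
    return H_U, dH, MI, VI

# A hypothetical 3-class confusion matrix with non-uniform marginals.
cm = np.array([[45.,  5.,  2.],
               [ 6., 30.,  6.],
               [ 4.,  5., 47.]])
H_U, dH, MI, VI = entropy_balance(cm)
assert abs(H_U - (dH + 2 * MI + VI)) < 1e-9      # equation (1) holds
```

Dividing the three right-hand terms by H_U yields coordinates that sum to 1, which is what places every classifier on the 2-simplex of the entropy triangle.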
“…The entropy triangle is a contingency-matrix visualization tool based on an often-overlooked decomposition of the joint entropy of two random variables [4]. Figure 1 shows such a decomposition with its three crucial regions: the mutual information,…”
Section: The Entropy Triangle: a Visualization Toolmentioning
confidence: 99%
“…Those at or close to the right vertex are not doing any job on very easy data for which they claim very high accuracy: they are specialized (majority) classifiers, and our intuition is that they are the kind of classifiers that give rise to the accuracy paradox [1]. In just this guise, the ET has already been successfully used in the evaluation of Speech Recognition systems [4,7]. But a simple extension of the ET is to endow it with a graduated axis or colormap that also allows us to visualize the correlation of such information-theoretic measures with other measures, such as accuracy, greatly enhancing its usefulness.…”
Section: Et)mentioning
confidence: 99%
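The accuracy paradox mentioned in this snippet is easy to reproduce. The sketch below (names are illustrative, not from the paper) computes the normalized entropy-triangle coordinates of a majority classifier on skewed data: it reaches 80% accuracy while transmitting zero mutual information, so it lands on the simplex edge where MI' = 0 rather than near the useful-classifier vertex:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits (zero-safe)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def et_coordinates(counts):
    """Normalized entropy-triangle coordinates (dH', 2*MI', VI') of a
    confusion matrix; the three components are barycentric and sum to 1."""
    P = counts / counts.sum()
    pk, pk_hat = P.sum(axis=1), P.sum(axis=0)
    H_U = np.log2(P.shape[0]) + np.log2(P.shape[1])
    mi = H(pk) + H(pk_hat) - H(P)        # mutual information
    vi = 2 * H(P) - H(pk) - H(pk_hat)    # variation of information
    dh = H_U - H(pk) - H(pk_hat)         # divergence from uniformity
    return dh / H_U, 2 * mi / H_U, vi / H_U

# A majority classifier on skewed 3-class data: it always predicts the
# first class, so its accuracy is 80% yet it learns nothing.
majority = np.array([[80., 0., 0.],
                     [10., 0., 0.],
                     [10., 0., 0.]])
dh, mi2, vi = et_coordinates(majority)
# mi2 == 0: despite the high accuracy, no information is transmitted.
```

This is exactly the kind of classifier the snippet warns about: an evaluation based on accuracy alone would rank it highly, while its entropy-triangle position exposes it immediately.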