Performance visualization spaces for classification with rejection option (2019)
DOI: 10.1016/j.patcog.2019.106984

Cited by 19 publications (26 citation statements); references 18 publications.
“…Furthermore, the second part of Theorem 1 proves that the composite classifier's expected conditional error rate, E[E^c_D], can be expressed in terms of the unconditional error and coverage rates of classifiers A and B, mirroring the relationship in Equation 5 (E^c = E^u / C). Theorem 2 also proves that this simple formulation for E[E^c_D] is equivalent to the non-linear interpolation scheme for conditional error proposed by Hanczar [4]. Note that while their notation interpolates between classifiers x and 0 to produce a classifier 0 + x, we continue to refer to these classifiers as A, B, and D respectively.…”
Section: Proof
confidence: 67%
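The relationship between conditional error, unconditional error, and coverage invoked in the quote above can be checked numerically. The sketch below assumes the standard definitions (coverage is the fraction of accepted instances; conditional error counts mistakes only over accepted instances, so E^c = E^u / C); the function and sentinel names are illustrative, not taken from the paper.

```python
# Sketch of the conditional-error / coverage relation E^c = E^u / C.
# REJECT and error_and_coverage are illustrative names, not from the paper.

REJECT = None  # sentinel label for rejected instances


def error_and_coverage(predictions, labels):
    """Return (unconditional error E_u, coverage C, conditional error E_c)."""
    n = len(labels)
    accepted = [(p, y) for p, y in zip(predictions, labels) if p is not REJECT]
    coverage = len(accepted) / n
    errors = sum(1 for p, y in accepted if p != y)
    e_u = errors / n                                    # errors over all instances
    e_c = errors / len(accepted) if accepted else 0.0   # errors over accepted only
    return e_u, coverage, e_c


preds = [0, 1, REJECT, 1, 0, REJECT, 1, 0]
labels = [0, 0, 1, 1, 0, 0, 1, 1]
e_u, c, e_c = error_and_coverage(preds, labels)
assert abs(e_c - e_u / c) < 1e-12   # conditional error equals E^u / C
```

With 2 errors among 6 accepted instances out of 8 total, E^u = 0.25, C = 0.75, and E^c = 2/6, so the identity holds exactly.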
“…1d to select their preferred classifier. Alternatively, if the cost of instance rejection (λ_r) can be specified in the range (0, 1) (with 0 denoting the cost of a correct classification and 1 representing the cost of a misclassification), then the cost-optimal classifier can be found by selecting the classifier with minimal expected cost C as defined in Equation 3 (from [4]) and applied in lines 15-17 of Algorithm 1.…”
Section: Proposed Null-labelling Methods For Rejection
confidence: 99%
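The excerpt does not show Equation 3 itself, but the cost-optimal selection it describes can be sketched under the standard Chow-style assumption that correct classifications cost 0, misclassifications cost 1, and each rejection costs λ_r, giving an expected cost of E^u + λ_r(1 − C). All names below are illustrative.

```python
# Hedged sketch of cost-optimal selection among reject-option classifiers,
# assuming expected cost = E^u + lambda_r * (1 - C), i.e. correct predictions
# cost 0, misclassifications cost 1, and each rejection costs lambda_r.


def expected_cost(e_u, coverage, lambda_r):
    """Unconditional error plus lambda_r paid for each rejected instance."""
    return e_u + lambda_r * (1.0 - coverage)


def select_cost_optimal(classifiers, lambda_r):
    """classifiers: list of (name, E_u, coverage); return the cheapest name."""
    return min(classifiers, key=lambda c: expected_cost(c[1], c[2], lambda_r))[0]


candidates = [
    ("A", 0.10, 1.00),  # accepts everything
    ("B", 0.02, 0.70),  # rejects 30% of instances
    ("D", 0.05, 0.90),  # composite / interpolated classifier
]
print(select_cost_optimal(candidates, lambda_r=0.2))  # prints "D"
```

At λ_r = 0.2 the costs are 0.10, 0.08, and 0.07 respectively, so D is selected; as λ_r grows toward 1, rejection becomes as expensive as error and the non-rejecting classifier A wins.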