2020 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit44484.2020.9174520
On Binary Statistical Classification from Mismatched Empirically Observed Statistics

Cited by 8 publications (4 citation statements)
References 8 publications
“…In more detail, this condition is required in the proof of the converse part. This condition was also imposed by Gutman [3], Hsu and Wang [4], and Zhou et al. [5].…”
Section: Maximum Error Exponent
Confidence: 96%
“…To this end, we generalize Gutman's classifier [3], which was shown to be first- and second-order optimal for memoryless sources (with no multiple subclasses) in [5]. This classifier uses training sequences from one of the two sources, as in [3]–[5], making a type-based decision for the source (subclass) with the smallest skewed Jensen-Shannon divergence [7] among the subclasses. In Theorem 1, we show that this classifier asymptotically achieves the maximum type-II error exponent in the class of deterministic classifiers for a given pair of distributions, provided the type-I error probability decays exponentially fast for all pairs of distributions.…”
Section: Contributions
Confidence: 99%
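The type-based decision rule described in this statement (pick the subclass whose training type minimizes the skewed Jensen-Shannon divergence to the test sequence's empirical type, and accept the corresponding hypothesis if that minimum falls below a threshold) can be sketched as follows. This is an illustrative reconstruction under assumptions, not the paper's exact classifier; in particular, the skew parameter `alpha` and the threshold `lam` are hypothetical placeholders for the quantities defined in the cited works.

```python
import numpy as np

def empirical_type(seq, alphabet_size):
    """Empirical distribution (type) of a sequence over {0, ..., alphabet_size-1}."""
    counts = np.bincount(seq, minlength=alphabet_size)
    return counts / len(seq)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, with the convention 0*log 0 = 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def skewed_js(p, q, alpha):
    """Skewed Jensen-Shannon divergence with skew parameter alpha in (0, 1):
    alpha * D(p || m) + (1 - alpha) * D(q || m), where m = alpha*p + (1-alpha)*q."""
    m = alpha * p + (1.0 - alpha) * q
    return alpha * kl_divergence(p, m) + (1.0 - alpha) * kl_divergence(q, m)

def classify(test_type, subclass_types, alpha=0.5, lam=0.1):
    """Type-based decision: find the subclass training type closest to the test
    type in skewed JS divergence, and accept the hypothesis if that minimum
    divergence is below lam. (alpha and lam are illustrative stand-ins, not the
    paper's exact quantities.)"""
    divs = [skewed_js(test_type, q, alpha) for q in subclass_types]
    best = int(np.argmin(divs))
    return best, divs[best] < lam
```

For example, with two binary subclass training types [0.9, 0.1] and [0.1, 0.9], a test type of [0.85, 0.15] is matched to the first subclass, since its skewed JS divergence to that type is much smaller.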