2011
DOI: 10.1109/tit.2011.2104670

Universal and Composite Hypothesis Testing via Mismatched Divergence

Abstract: For the universal hypothesis testing problem, where the goal is to decide between the known null hypothesis distribution and some other unknown distribution, Hoeffding proposed a universal test in the nineteen sixties. Hoeffding's universal test statistic can be written in terms of the Kullback-Leibler (K-L) divergence between the empirical distribution of the observations and the null hypothesis distribution. In this paper, a modification of Hoeffding's test is considered, based on a relaxation of the K-L divergence…
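For a concrete sense of the test statistic described in the abstract, here is a minimal, illustrative sketch: Hoeffding's test computes the K-L divergence between the empirical distribution of the observations and the known null distribution, and rejects the null when that divergence exceeds a threshold. This is not the authors' implementation, it does not include the paper's mismatched-divergence relaxation, and the function names, alphabet, and threshold value are assumptions made for this example.

```python
import numpy as np

def empirical_distribution(samples, alphabet_size):
    # Type (empirical distribution) of the observations on {0, ..., alphabet_size - 1}.
    counts = np.bincount(samples, minlength=alphabet_size)
    return counts / counts.sum()

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) in nats, with the convention 0 * log 0 = 0.
    # Assumes q > 0 wherever p > 0 (true here because the null has full support).
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def hoeffding_test(samples, null_dist, threshold):
    # Hoeffding's universal test: reject H0 when D(empirical || null) > threshold.
    gamma_n = empirical_distribution(samples, len(null_dist))
    statistic = kl_divergence(gamma_n, null_dist)
    return statistic, statistic > threshold

# Hypothetical usage: 500 samples drawn from a law that differs from the uniform null.
rng = np.random.default_rng(0)
null_dist = np.array([0.25, 0.25, 0.25, 0.25])
samples = rng.choice(4, size=500, p=[0.4, 0.3, 0.2, 0.1])
stat, reject = hoeffding_test(samples, null_dist, threshold=0.05)
print(f"D(empirical || null) = {stat:.4f}, reject H0: {reject}")
```

In practice the threshold is chosen from the desired false-alarm exponent; the paper's mismatched-divergence test replaces the K-L divergence with a relaxation, which is not reproduced here.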

Cited by 41 publications (66 citation statements)
References 25 publications
“…Our upper bound on the computational complexity depends on the diameter of the tree, defined in (42) as the maximum over all node pairs (i, j) of d(i, j), where d(i, j) is the length (number of hops) of the unique path between nodes i and j. For example, for the nonedge in the subtree in Fig.…”
Section: Computational Complexity (mentioning)
confidence: 99%
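The quoted complexity bound depends on the tree diameter, i.e., the largest number of hops on the unique path between any pair of nodes. As a purely illustrative sketch of that definition (not code from the cited paper; the adjacency-list representation and function names are assumptions), the diameter of a tree can be computed with two breadth-first searches:

```python
from collections import deque

def hop_distances(adj, source):
    # Breadth-first search: number of hops from `source` to every node of the tree.
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def tree_diameter(adj):
    # Diameter = max over node pairs of the hop-length of the unique path between them.
    # Two BFS passes suffice on a tree: the farthest node from any start vertex is an
    # endpoint of a longest path, and the farthest node from that endpoint gives the diameter.
    start = next(iter(adj))
    d1 = hop_distances(adj, start)
    endpoint = max(d1, key=d1.get)
    d2 = hop_distances(adj, endpoint)
    return max(d2.values())

# Hypothetical example: a path 0-1-2-3 with a leaf 4 attached to node 1.
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
print(tree_diameter(adj))  # 3 (e.g., the path 0-1-2-3)
```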
“…Subsequent work was done in [40]–[42], albeit for differently defined uncertainty classes known as moment classes.…”
Section: Relation of the Maximum-Likelihood Structure Learning Pro… (mentioning)
confidence: 99%
“…Thereafter, Efron studied the curvature of the statistical manifold [46], Chentsov introduced a family of affine connections [47], and Amari put forward the concept of dual affine connections [48][49][50], introducing a potential function to define the notion of divergence, a weaker form of distance. In recent years, information geometry has been widely applied in various domains, including information theory, system theory, neural networks, statistical inference, communication coding, physics, and medical imaging [51][52][53][54][55][56][57][58][59][60][61][62]. As a useful tool for studying information metrics, information geometry is regarded as the second generation of information theory [63].…”
Section: About the Metrics of Information (mentioning)
confidence: 99%
“…See also Theorem 3.1.2 of [3] for a version of this result on a finite probability space. The papers [13], [16] contain more background and other applications of this result.…”
Section: Generator (mentioning)
confidence: 99%
“…The eigenvector equation (16) implies that Σ_{x'} Ď(x, x') = 0 for all x ∈ X, as required for a Markovian generator. Proof of Proposition 2.2: Part (i) is essentially known: from (14) it follows that W*_∞ is the multiplicative-ergodic limit,…”
Section: Generator (mentioning)
confidence: 99%
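As context for the last quoted passage: a finite-state Markovian generator has nonnegative off-diagonal rates and rows that sum to zero, which is the property the eigenvector equation is said to enforce. The sketch below only checks that defining property for a candidate rate matrix; it does not reproduce the matrix Ď or equation (16) of the citing paper, and the function name and tolerance are assumptions.

```python
import numpy as np

def is_markov_generator(Q, tol=1e-9):
    # Defining properties of a finite-state Markovian generator:
    # (i) off-diagonal entries (transition rates) are nonnegative,
    # (ii) every row sums to zero, so the diagonal holds the negative total exit rate.
    Q = np.asarray(Q, dtype=float)
    off_diag = Q - np.diag(np.diag(Q))
    rates_nonnegative = np.all(off_diag >= -tol)
    rows_sum_to_zero = np.all(np.abs(Q.sum(axis=1)) < tol)
    return bool(rates_nonnegative and rows_sum_to_zero)

# Hypothetical 3-state generator: each row sums to zero.
Q = np.array([[-0.5, 0.3, 0.2],
              [0.1, -0.4, 0.3],
              [0.2, 0.2, -0.4]])
print(is_markov_generator(Q))  # True
```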