2009
DOI: 10.1109/tit.2008.2008128

Robust Hypothesis Testing With a Relative Entropy Tolerance

Abstract: This paper considers the design of a minimax test for two hypotheses where the actual probability densities of the observations are located in neighborhoods obtained by placing a bound on the relative entropy between actual and nominal densities. The minimax problem admits a saddle point, which is characterized. The robust test applies a nonlinear transformation that flattens the nominal likelihood ratio in the vicinity of one. Results are illustrated by considering the transmission of binary data in the presence…
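To make the abstract's "flattening" concrete, here is a minimal Python sketch. In the paper the nonlinearity is determined by the saddle point (the least-favorable densities); the dead-zone map below, with an arbitrary threshold t and Gaussian nominals, is only an illustrative stand-in, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch, NOT the paper's exact transformation: the robust
# test replaces the nominal log-likelihood ratio L(x) with a version that
# is flat where L(x) is near zero, i.e. where the likelihood ratio is near
# one. A simple dead-zone map with an arbitrary threshold t stands in for
# the saddle-point-derived nonlinearity.

def flattened_llr(llr, t=0.5):
    """Dead-zone map: zero on [-t, t], shifted toward zero outside."""
    llr = np.asarray(llr, dtype=float)
    return np.sign(llr) * np.maximum(np.abs(llr) - t, 0.0)

# Toy nominals: p0 = N(-1, 1) and p1 = N(+1, 1), for which L(x) = 2x.
x = np.linspace(-3.0, 3.0, 7)
print(flattened_llr(2.0 * x))   # flat around x = 0, linear in the tails
```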


Cited by 71 publications (40 citation statements). References 12 publications.
“…The proof of Theorem 3 is given in two steps. First, we show that (7) in Theorem 2 defines, up to some normalization… [Fig. 4: plot of log(q_1(x)/q_0(x)) versus x]”
Section: Appendix A, Proof of Theorem (mentioning)
confidence: 97%
“…However, in contrast to minimax solutions, such methods do not guarantee pre-specified error probabilities and require changes in the distributions to happen slowly enough for the estimates to be updated. For these reasons, more flexible uncertainty models for minimax robust tests are a topic of ongoing research [7]–[9].…”
Section: Introduction (mentioning)
confidence: 99%
“…However, modeling errors, which are the other source of uncertainty in signal processing applications, cannot be well modeled using Huber's techniques [2]. Dabak and Johnson for the asymptotic case [6], and later Levy for the single-sample case [7], suggested that for modeling errors, subsets of topological spaces defined with respect to smooth distances, such as the KL divergence, are more suitable than Huber's uncertainty classes. The results of [7] are applicable if the nominal density functions under each hypothesis are symmetric, the robustness parameters are equal, and the nominal likelihood ratio function is monotone.…”
Section: A. Related Work (mentioning)
confidence: 99%
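As a concrete reading of the KL-divergence neighborhoods described in this excerpt, the sketch below checks whether a discrete density lies in the ball {q : D(q ‖ p_nominal) ≤ ε}. The alphabet size, nominal density, and tolerance ε are illustrative assumptions, not values from [6] or [7].

```python
import numpy as np

def kl_divergence(q, p):
    """Discrete D(q || p), using the convention 0 * log(0 / p) = 0."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = q > 0
    return float(np.sum(q[mask] * np.log(q[mask] / p[mask])))

def in_kl_ball(q, p_nominal, eps):
    """Membership in the neighborhood {q : D(q || p_nominal) <= eps}."""
    return kl_divergence(q, p_nominal) <= eps

p_nominal = np.full(4, 0.25)                # toy nominal density
q = np.array([0.30, 0.20, 0.30, 0.20])      # toy perturbed density
print(kl_divergence(q, p_nominal))          # ~0.020
print(in_kl_ball(q, p_nominal, eps=0.05))   # True
```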
“…Owing to the mild constraints on f, uncertainty sets of the form (3) offer a great amount of flexibility and have attracted increased attention in recent years. In [5] and [10], minimax optimal tests based on the Kullback-Leibler divergence were derived under varying assumptions. Minimax optimal tests have also been derived for the squared Hellinger distance [6,7], the total variation distance [8], and α-divergences [9,10].…”
Section: Uncertainty Sets (mentioning)
confidence: 99%
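The uncertainty sets of the form (3) mentioned in this excerpt are f-divergence balls. The sketch below evaluates the standard discrete formula D_f(q ‖ p) = Σ_x p(x) f(q(x)/p(x)) for the generators behind the divergences named in the excerpt; p and q are toy densities (all entries strictly positive, so the KL generator is defined).

```python
import numpy as np

def f_divergence(q, p, f):
    """Discrete D_f(q || p) = sum_x p(x) * f(q(x) / p(x)), assuming p > 0."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    return float(np.sum(p * f(q / p)))

# Convex generators with f(1) = 0 for the divergences named above:
f_kl        = lambda t: t * np.log(t)            # Kullback-Leibler
f_hellinger = lambda t: (np.sqrt(t) - 1.0) ** 2  # squared Hellinger
f_tv        = lambda t: 0.5 * np.abs(t - 1.0)    # total variation

p = np.full(4, 0.25)
q = np.array([0.40, 0.10, 0.30, 0.20])
for name, f in [("KL", f_kl), ("Hellinger^2", f_hellinger), ("TV", f_tv)]:
    print(name, round(f_divergence(q, p, f), 4))
```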
“…A common way of specifying uncertainty sets is via a neighborhood around a nominal distribution, which represents an ideal system state or model [2]. In many works on robust detection, the use of f-divergence balls has been proposed as a useful and versatile model to construct such neighborhoods [3][4][5][6][7][8][9][10]. In contrast to outlier models, such as ε-contamination [11], f-divergence balls do not allow for arbitrarily large deviations from the nominals and, therefore, have been argued to better represent scenarios where the shape of a distribution is subject to uncertainty, but there are no gross outliers in the data [5].…”
Section: Introduction (mentioning)
confidence: 99%
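To illustrate the excerpt's contrast between ε-contamination and divergence balls, the toy example below builds a contaminated density by piling the contamination on an outcome that is rare under the nominal. The likelihood ratio there becomes large (a "gross outlier"), and the KL divergence from the nominal is correspondingly large, so a small-radius divergence ball would exclude such a density. All numbers are made up for illustration.

```python
import numpy as np

# eps-contamination admits q = (1 - eps) * p + eps * h for an arbitrary
# density h. Concentrating h on a rare outcome yields a huge likelihood
# ratio there, which a small-radius f-divergence ball would rule out.

p = np.array([0.899, 0.100, 0.001])   # nominal: outcome 2 is very rare
eps = 0.05
h = np.array([0.0, 0.0, 1.0])         # contamination piled on the rare outcome
q = (1.0 - eps) * p + eps * h

print(q / p)                          # likelihood ratio ~51 at the rare outcome
print(np.sum(q * np.log(q / p)))      # D(q || p) ~ 0.15: far from the nominal
```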