2012
DOI: 10.1109/tit.2011.2171533
A Generalized Poor-Verdú Error Bound for Multihypothesis Testing

Abstract: A lower bound on the minimum error probability for multihypothesis testing is established. The bound, which is expressed in terms of the cumulative distribution function of the tilted posterior hypothesis distribution given the observation, with a tilting parameter, generalizes an earlier bound due to Poor and Verdú (1995). A sufficient condition is established under which the new bound (minus a multiplicative factor) provides the exact error probability asymptotically. Examples illustrating the new bound a…
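The structure of such a bound can be sketched numerically. The snippet below is a minimal illustration assuming the generalized bound takes the form P_e ≥ (1 − α)·Pr[P_θ(X|Y) ≤ α], where P_θ is the posterior tilted with parameter θ; the function names and the toy distributions are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def poor_verdu_bound(prior, likelihood, alpha, theta=1.0):
    """Lower-bound the minimum error probability of M-ary hypothesis
    testing, assuming the bound has the tilted Poor-Verdu form
        P_e >= (1 - alpha) * Pr[ P_theta(X|Y) <= alpha ],
    where P_theta is the posterior tilted with parameter theta.
    Rows of `likelihood` are P(y|x) for each hypothesis x."""
    joint = prior[:, None] * likelihood            # P(x, y)
    p_y = joint.sum(axis=0)                        # marginal P(y)
    post = joint / p_y                             # posterior P(x|y)
    tilted = post**theta / (post**theta).sum(axis=0)  # tilted posterior
    # Mass (under the joint) where the tilted posterior of the true
    # hypothesis is at most alpha.
    mass = joint[tilted <= alpha].sum()
    return (1.0 - alpha) * mass

def exact_mpe(prior, likelihood):
    """Exact minimum probability of error of the MAP rule."""
    joint = prior[:, None] * likelihood
    return 1.0 - joint.max(axis=0).sum()

# Toy binary example (illustrative numbers only).
prior = np.array([0.5, 0.5])
lik = np.array([[0.8, 0.2],
                [0.3, 0.7]])
print(poor_verdu_bound(prior, lik, alpha=0.4), exact_mpe(prior, lik))
```

For any valid α ∈ [0, 1) the bound must sit below the exact minimum error probability, which makes it easy to sanity-check on small examples.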

Cited by 11 publications (19 citation statements). References 8 publications.
“…13) One of the bounds by Han and Verdú [34] in Item 8) was generalized by Polyanskiy and Verdú [56] to give a lower bound on the α-mutual information ([74], [85]). 14) In [73], Shayevitz gave a lower bound, in terms of the Rényi divergence, on the maximal worst-case missed-detection exponent for a binary composite hypothesis testing problem when the false-alarm probability decays to zero with the number of i.i.d. observations.…”
Section: Existing Bounds Involving Rényi's Information Measures
confidence: 99%
“…where the outer expectations are with respect to Y ∼ P_Y; (284) follows from (14) and the equality βρ/(β−1) = α/(α−1) (see (282) and (283)); (285) follows from β/(β−1) > 0, 1/β > 0, Jensen's inequality, and the concavity of t^ρ on [0, ∞); (286) follows from (283) and (282); (287) holds due to the concavity of t^θ on [0, ∞) and Jensen's inequality; (288) follows from (282); (289) follows from (14).…”
Section: Appendix A, Proof of Proposition
confidence: 99%
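The Jensen step in that proof chain is easy to check numerically: for a nonnegative random variable T and ρ ∈ [0, 1], concavity of t ↦ t^ρ gives E[T^ρ] ≤ (E[T])^ρ. The distribution and values below are illustrative only, not from the cited derivation.

```python
import numpy as np

# Numeric illustration of Jensen's inequality for the concave map
# t -> t**rho on [0, inf) with rho in [0, 1]:
#     E[T**rho] <= (E[T])**rho.
rng = np.random.default_rng(0)
T = rng.exponential(scale=2.0, size=100_000)  # nonnegative samples

for rho in (0.3, 0.5, 0.9):
    lhs = (T**rho).mean()      # E[T**rho]
    rhs = T.mean()**rho        # (E[T])**rho
    print(rho, lhs <= rhs)
```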
“…Note that for a given x, as n increases, p_n(θ_m|x) approaches 1 from below for m = m*(x) = argmax_m {p(θ_m|x)} and approaches 0 from above for all other values of m. This increasing purification of the posterior p_n(Θ|X) with n, also known as the tilted posterior PMF of order n [14], implies a decreasing value of the GEE.…”
Section: B. Feder-Merhav and Related Upper Bounds on the MPE
confidence: 94%
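The purification effect described in that excerpt can be seen directly: raising a posterior PMF to a power n and renormalizing drives the largest mass toward 1 and all other masses toward 0. The posterior values here are illustrative, not from the paper.

```python
import numpy as np

def tilted(p, n):
    """Tilted (generalized) posterior PMF of order n: p**n / sum(p**n)."""
    q = p**n
    return q / q.sum()

# Illustrative posterior PMF over three hypotheses.
posterior = np.array([0.5, 0.3, 0.2])

for n in (1, 2, 5, 20):
    print(n, np.round(tilted(posterior, n), 4))
# As n grows, the mass at argmax p (index 0) climbs toward 1 from
# below while the remaining entries shrink toward 0 from above.
```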
“…For the special choice s_m(X) = p(θ_m|X), the RHS of inequality (14) reduces to −H(Θ|X), which is the FM bound. A more general choice involves the generalized posterior PMF p_n(θ_m|X), defined earlier in (7), that becomes sharper the larger its order n,…”
Section: Other Derivations of the FM Bound and New Upper Bounds
confidence: 99%
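To make the FM-bound remark concrete: by Jensen's inequality, log₂ E[max_m p(θ_m|X)] ≥ −H(Θ|X), which yields the entropy-based upper bound MPE ≤ 1 − 2^(−H(Θ|X)). The sketch below assumes this standard form of the Feder-Merhav bound; the toy joint distribution is illustrative, not from the paper.

```python
import numpy as np

def conditional_entropy_bits(joint):
    """H(Theta|X) in bits for a joint PMF with rows = hypotheses,
    columns = observations."""
    p_x = joint.sum(axis=0)                 # observation marginal
    post = joint / p_x                      # posterior p(theta|x)
    h = np.where(post > 0, -post * np.log2(np.where(post > 0, post, 1.0)), 0.0)
    return float((p_x * h.sum(axis=0)).sum())

def fm_upper_bound(joint):
    """Assumed Feder-Merhav form: MPE <= 1 - 2**(-H(Theta|X))."""
    return 1.0 - 2.0 ** (-conditional_entropy_bits(joint))

def mpe(joint):
    """Exact minimum probability of error of the MAP rule."""
    return 1.0 - float(joint.max(axis=0).sum())

# Toy joint PMF (illustrative numbers only).
joint = np.array([[0.40, 0.10],
                  [0.15, 0.35]])
print(mpe(joint), fm_upper_bound(joint))
```

On any valid joint PMF the exact MPE should fall at or below the entropy bound, which is a quick way to sanity-check the implementation.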