This paper gives upper and lower bounds on the minimum error probability of Bayesian M-ary hypothesis testing in terms of the Arimoto-Rényi conditional entropy of an arbitrary order α. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy (α = 1) is demonstrated. In particular, in the case where M is finite, we show how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano's inequality, allowing M to be infinite, a lower bound on the Arimoto-Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as a function of the Arimoto-Rényi conditional entropy for both positive and negative α. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence. In the setup of discrete memoryless channels, we analyze the exponentially vanishing decay of the Arimoto-Rényi conditional entropy of the transmitted codeword given the channel output when averaged over a random-coding ensemble.
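For reference, a brief sketch of the central quantities under standard definitions (the paper's own notation may differ slightly): the Arimoto-Rényi conditional entropy of order α ∈ (0, 1) ∪ (1, ∞) is

\[
H_{\alpha}(X \mid Y) \triangleq \frac{\alpha}{1-\alpha} \, \log \sum_{y} P_{Y}(y) \Bigl( \sum_{x} P_{X|Y}^{\alpha}(x \mid y) \Bigr)^{\frac{1}{\alpha}},
\]

which recovers the Shannon conditional entropy H(X|Y) in the limit α → 1, while the minimum error probability of Bayesian M-ary hypothesis testing, achieved by the maximum-a-posteriori decision rule, is

\[
\varepsilon_{X|Y} = 1 - \mathbb{E}\Bigl[ \max_{x} P_{X|Y}(x \mid Y) \Bigr].
\]

The classical Fano inequality, H(X|Y) ≤ h(ε_{X|Y}) + ε_{X|Y} log(M − 1) with h(·) denoting the binary entropy function, is the α = 1 baseline that the generalized bounds tighten.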
Index Terms: Arimoto-Rényi conditional entropy, Bayesian minimum probability of error, Chernoff information, Fano's inequality, list decoding, M-ary hypothesis testing, random coding, Rényi entropy, Rényi divergence.