Arimoto–Rényi Conditional Entropy and Bayesian $M$ -Ary Hypothesis Testing
Author: | Igal Sason, Sergio Verdú |
---|---|
Year of publication: | 2018 |
Subject: |
Conditional entropy, Discrete mathematics, Bayesian probability, List decoding, Networking & telecommunications, Engineering and technology, Library and Information Sciences, Upper and lower bounds, Computer Science Applications, Rényi entropy, Electrical engineering, electronic engineering, information engineering, Entropy (information theory), Artificial intelligence & image processing, Fano's inequality, Computer Science::Information Theory, Information Systems, Mathematics, Statistical hypothesis testing |
Source: | IEEE Transactions on Information Theory. 64:4-25 |
ISSN: | 1557-9654; 0018-9448 |
DOI: | 10.1109/tit.2017.2757496 |
Description: | This paper gives upper and lower bounds on the minimum error probability of Bayesian $M$-ary hypothesis testing in terms of the Arimoto–Rényi conditional entropy of an arbitrary order $\alpha$. The improved tightness of these bounds over their specialized versions with the Shannon conditional entropy ($\alpha = 1$) is demonstrated. In particular, in the case where $M$ is finite, we show how to generalize Fano's inequality under both the conventional and list-decision settings. As a counterpart to the generalized Fano's inequality, allowing $M$ to be infinite, a lower bound on the Arimoto–Rényi conditional entropy is derived as a function of the minimum error probability. Explicit upper and lower bounds on the minimum error probability are obtained as a function of the Arimoto–Rényi conditional entropy for both positive and negative $\alpha$. Furthermore, we give upper bounds on the minimum error probability as functions of the Rényi divergence. In the setup of discrete memoryless channels, we analyze the exponentially vanishing decay of the Arimoto–Rényi conditional entropy of the transmitted codeword given the channel output when averaged over a random-coding ensemble. |
Database: | OpenAIRE |
External link: |
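The description relates two quantities: the Arimoto–Rényi conditional entropy $H_\alpha(X|Y)$ and the Bayesian minimum error probability of guessing $X$ from $Y$. The following is a minimal sketch of both, using Arimoto's standard definition $H_\alpha(X|Y) = \frac{\alpha}{1-\alpha}\log\sum_y\big(\sum_x P_{XY}(x,y)^\alpha\big)^{1/\alpha}$ and a toy joint distribution chosen here for illustration (not taken from the paper):

```python
import math

# Toy joint pmf P[x][y] for X in {0, 1, 2}, Y in {0, 1}.
# Illustrative values only (they sum to 1); not from the paper.
P = [[0.30, 0.10],
     [0.05, 0.25],
     [0.15, 0.15]]

def arimoto_renyi_cond_entropy(P, alpha):
    """Arimoto-Renyi conditional entropy H_alpha(X|Y) in nats, for
    alpha > 0, alpha != 1:
        H_alpha(X|Y) = alpha/(1-alpha) * log sum_y (sum_x P(x,y)^alpha)^(1/alpha)
    """
    if alpha == 1:
        raise ValueError("alpha = 1 is the Shannon case; use H(X|Y) directly")
    s = sum(sum(P[x][y] ** alpha for x in range(len(P))) ** (1.0 / alpha)
            for y in range(len(P[0])))
    return alpha / (1.0 - alpha) * math.log(s)

def min_error_prob(P):
    """Bayesian minimum error probability under the MAP decision rule:
        eps = 1 - sum_y max_x P(x, y)
    """
    return 1.0 - sum(max(P[x][y] for x in range(len(P)))
                     for y in range(len(P[0])))

eps = min_error_prob(P)                   # 0.45 for this toy pmf
h2 = arimoto_renyi_cond_entropy(P, 2.0)   # the alpha = 2 (collision) case
```

$H_\alpha(X|Y)$ is non-increasing in $\alpha$ and tends to $-\log(1-\varepsilon)$ as $\alpha \to \infty$, which is the endpoint of the family of bounds the abstract describes; the generalized Fano-type bounds of the paper interpolate between this case and the Shannon case $\alpha = 1$.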