A Tight Upper Bound on Mutual Information
Author: Michal Hledik, Gašper Tkačik, Thomas R. Sokolowski
Language: English
Subject: Information Theory (cs.IT); Neurons and Cognition (q-bio.NC); mutual information; equivocation (conditional entropy); upper and lower bounds; joint probability distribution; maximum a posteriori estimation; decodes; communication channel; algorithm
Source: 2019 IEEE Information Theory Workshop (ITW)
DOI: 10.1109/itw44776.2019.8989292
Description: We derive a tight lower bound on equivocation (conditional entropy), or equivalently a tight upper bound on the mutual information between a signal variable and channel outputs. The bound is expressed in terms of the joint distribution of the signals and their maximum a posteriori (MAP) decodes (the most probable signals given the channel output). As part of our derivation, we characterize the key properties of the distribution over signals, channel outputs, and decodes that minimizes equivocation and maximizes mutual information. This work addresses a problem in data analysis, where the mutual information between signals and decodes is sometimes used as a lower bound on the mutual information between signals and channel outputs. Our result provides a corresponding upper bound (a brief illustrative sketch of the lower-bound usage appears after this record).
Comment: 6 pages, 3 figures; proof illustration added
Database: OpenAIRE
External link:
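As a rough, self-contained illustration of the data-analysis practice the abstract mentions, the Python sketch below computes both I(S; X), the mutual information between a discrete signal and channel output, and I(S; Ŝ), the mutual information between the signal and its MAP decode, which the data processing inequality guarantees is a lower bound on I(S; X). All names (`p_sx`, `mutual_information`, `signal_decode_joint`) and the toy channel are illustrative assumptions, not notation or code from the paper, and the paper's new tight upper bound itself is not reproduced here.

```python
# Minimal sketch, assuming a discrete signal S and channel output X with a known
# joint distribution p_sx (rows: signals, columns: outputs). Names and the toy
# channel are illustrative, not taken from the paper.
import numpy as np

def mutual_information(p_joint):
    """Mutual information (in bits) of a 2D joint probability table."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_row = p_joint.sum(axis=1, keepdims=True)   # marginal of the row variable
    p_col = p_joint.sum(axis=0, keepdims=True)   # marginal of the column variable
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log2(p_joint[mask] / (p_row @ p_col)[mask])))

def signal_decode_joint(p_sx):
    """Joint distribution of the signal S and its MAP decode S_hat = argmax_s p(s | x)."""
    n_signals, _ = p_sx.shape
    map_decode = np.argmax(p_sx, axis=0)         # MAP decode for each channel output x
    p_s_shat = np.zeros((n_signals, n_signals))
    for x, s_hat in enumerate(map_decode):
        p_s_shat[:, s_hat] += p_sx[:, x]         # collapse outputs onto their decodes
    return p_s_shat

# Toy example: a binary signal observed through a noisy 3-output channel, uniform input.
p_x_given_s = np.array([[0.6, 0.3, 0.1],
                        [0.1, 0.3, 0.6]])
p_sx = p_x_given_s / 2.0                         # joint p(s, x) with uniform p(s)

i_sx = mutual_information(p_sx)                              # I(S; X)
i_s_shat = mutual_information(signal_decode_joint(p_sx))     # I(S; S_hat) <= I(S; X)
print(f"I(S;X) = {i_sx:.3f} bits, I(S;S_hat) = {i_s_shat:.3f} bits")
```

In this toy channel the two middle outputs are collapsed onto a single decode, so I(S; Ŝ) is strictly smaller than I(S; X); the gap between such a lower bound and the true mutual information is exactly the kind of quantity the paper's upper bound is meant to help control.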