Author: Tom Leinster, John C. Baez, Tobias Fritz
Language: English
Year of publication: 2011
Source: Entropy, Vol 13, Iss 11, Pp 1945-1957 (2011)
Document type: article
ISSN: 1099-4300
DOI: 10.3390/e13111945
Description: There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the "information loss", or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.
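The "information loss" of a measure-preserving function, as described in the abstract, is the drop in Shannon entropy when a random variable is replaced by a function of itself. The following is a minimal sketch, not code from the paper; the function names (`shannon_entropy`, `pushforward`, `information_loss`) and the dice example are illustrative assumptions.

```python
import math
from collections import defaultdict

def shannon_entropy(p):
    """Shannon entropy (in bits) of a finite distribution given as {outcome: prob}."""
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def pushforward(p, f):
    """Distribution of f(X) when X is distributed according to p."""
    q = defaultdict(float)
    for x, px in p.items():
        q[f(x)] += px  # f merges outcomes; their probabilities add
    return dict(q)

def information_loss(p, f):
    """Entropy lost by applying f: H(X) - H(f(X)), i.e. H(X | f(X))."""
    return shannon_entropy(p) - shannon_entropy(pushforward(p, f))

# Illustrative example: a fair 4-sided die collapsed to its parity.
# H(X) = 2 bits, H(X mod 2) = 1 bit, so 1 bit of information is lost.
p = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}
loss = information_loss(p, lambda x: x % 2)
```

Since `f` only merges outcomes, `H(f(X)) <= H(X)` always holds, so the loss is nonnegative; the paper's result is that this assignment of a nonnegative number to each measure-preserving map is, up to scale, the only functorial, convex-linear, continuous one.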
Database: Directory of Open Access Journals