Author: |
Thierrin, Ferenc Cole; Alajaji, Fady; Linder, Tamás |
Year of publication: |
2022 |
Subject: |
|
Source: |
Entropy, Vol. 24, Issue 10, October 2022 |
Document type: |
Working Paper |
DOI: |
10.3390/e24101417 |
Description: |
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of deep learning generative adversarial networks. In this work, we build upon our results in [1] by deriving the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family and tabulating the results for ease of reference. We also summarise the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources. |
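For reference, a sketch of the two measures as typically defined in this line of work (the conventions below are assumed from the related literature, not stated in this record): for probability densities $p$ and $q$ and order $\alpha \in (0,1) \cup (1,\infty)$, the Rényi cross-entropy is
$H_\alpha(p;q) = \frac{1}{1-\alpha} \log \int p(x)\, q(x)^{\alpha-1}\, \mathrm{d}x,$
while the Natural Rényi cross-entropy combines the Rényi divergence and the Rényi differential entropy as $\tilde{H}_\alpha(p;q) = D_\alpha(p\|q) + h_\alpha(p)$; both recover the Shannon differential cross-entropy $-\int p(x) \log q(x)\, \mathrm{d}x$ in the limit $\alpha \to 1$. |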
Database: |
arXiv |
External link: |
|