Author:
Györfi L; Department of Computer Science and Information Theory, Budapest University of Technology and Economics, H-1111 Budapest, Hungary., Linder T; Department of Mathematics and Statistics, Queen's University, Kingston, ON K7L 3N6, Canada., Walk H; Fachbereich Mathematik, Universität Stuttgart, 70569 Stuttgart, Germany.
Language:
English
Source:
Entropy (Basel, Switzerland) [Entropy (Basel)] 2023 Sep 28; Vol. 25 (10). Date of Electronic Publication: 2023 Sep 28. |
DOI:
10.3390/e25101394 |
Abstract:
We study the excess minimum risk in statistical inference, defined as the difference between the minimum expected loss when estimating a random variable from an observed feature vector and the minimum expected loss when estimating the same random variable from a transformation (statistic) of the feature vector. After characterizing lossless transformations, i.e., transformations for which the excess risk is zero for all loss functions, we construct a partitioning test statistic for the hypothesis that a given transformation is lossless, and we show that for i.i.d. data the test is strongly consistent. More generally, we develop information-theoretic upper bounds on the excess risk that uniformly hold over fairly general classes of loss functions. Based on these bounds, we introduce the notion of a δ-lossless transformation and give sufficient conditions for a given transformation to be universally δ-lossless. Applications to classification, nonparametric regression, portfolio strategies, information bottlenecks, and deep learning are also surveyed.
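A minimal formalization of the central quantity described in the abstract, written in standard notation that is assumed here rather than taken from the paper (Y the random variable to be estimated, X the observed feature vector, T the transformation, ℓ a loss function, f a measurable estimator):

    L^*_\ell(Y \mid X) = \inf_{f} \mathbb{E}\big[\ell\big(Y, f(X)\big)\big],
    \qquad
    \Delta_\ell(T) = L^*_\ell\big(Y \mid T(X)\big) - L^*_\ell(Y \mid X) \ge 0.

In this reading, T is lossless when \Delta_\ell(T) = 0 for every loss function \ell under consideration, and δ-lossless when \Delta_\ell(T) \le \delta uniformly over the given class of losses; the nonnegativity of \Delta_\ell(T) holds because any estimator based on T(X) is also an estimator based on X.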
Database:
MEDLINE |
External link:
|