Author:
Guedj B; Centre for Artificial Intelligence, Department of Computer Science, University College London, London WC1V 6LJ, UK; Inria Lille-Nord Europe Research Centre and Inria London, 59800 Lille, France.
Pujol L; Laboratoire de Mathématiques d'Orsay, Université Paris-Saclay, CNRS, 91405 Orsay, France.
Language:
English
Source:
Entropy (Basel, Switzerland) [Entropy (Basel)] 2021 Nov 18; Vol. 23 (11). Date of Electronic Publication: 2021 Nov 18. |
DOI:
10.3390/e23111529 |
Abstract:
"No free lunch" results state the impossibility of obtaining meaningful bounds on the error of a learning algorithm without prior assumptions and modelling, which is more or less realistic for a given problem. Some models are "expensive" (strong assumptions, such as sub-Gaussian tails), others are "cheap" (simply finite variance). As it is well known, the more you pay, the more you get: in other words, the most expensive models yield the more interesting bounds. Recent advances in robust statistics have investigated procedures to obtain tight bounds while keeping the cost of assumptions minimal. The present paper explores and exhibits what the limits are for obtaining tight probably approximately correct (PAC)-Bayes bounds in a robust setting for cheap models. |
Database:
MEDLINE |