Randomized learning: Generalization performance of old and new theoretically grounded algorithms
Author: | Sandro Ridella, Davide Anguita, Luca Oneto, Francesca Cipollini |
---|---|
Language: | English |
Year of publication: | 2018 |
Subject: | PAC-Bayes; Differential privacy; Randomized learning algorithms; Randomized models; Generalization; Data-dependent posterior; Distribution-dependent prior; Rate of convergence; Artificial Intelligence; Cognitive Neuroscience; Computer Vision and Pattern Recognition; Computer Science Applications |
Description: | In the context of assessing the generalization ability of a randomized model or learning algorithm, PAC-Bayes and Differential Privacy (DP) theories are the state-of-the-art tools. For this reason, in this paper we develop tight DP-based generalization bounds that improve over the current state-of-the-art ones both in terms of constants and in terms of rate of convergence. Moreover, we prove that some old and new randomized algorithms exhibit better generalization performance than their non-private counterparts when DP is exploited to assess their generalization ability. Results on a series of algorithms and real-world problems show the practical validity of the theoretical results. |
Database: | OpenAIRE |
External link: |
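As background for the description above, the following is a minimal sketch of the classical, already-known link between ε-differential privacy and expected generalization. It is context only, not the tighter bounds developed in the paper; the notation (L_D, L_S) and the bounded-loss assumption are introduced here purely for illustration.

```latex
% Background sketch (assumed notation, not the paper's result):
% the classical bound relating epsilon-DP to the expected generalization gap.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $A$ be an $\varepsilon$-differentially private learning algorithm, let
$S=\{z_1,\dots,z_n\}$ be an i.i.d.\ sample from a distribution $\mathcal{D}$,
and assume the loss satisfies $\ell \in [0,1]$. Define
\[
  L_{\mathcal{D}}(h) = \mathbb{E}_{z\sim\mathcal{D}}\bigl[\ell(h,z)\bigr],
  \qquad
  L_{S}(h) = \frac{1}{n}\sum_{i=1}^{n}\ell(h,z_i).
\]
A standard argument (replace one sample by a fresh draw and apply the
$\varepsilon$-DP guarantee to the non-negative bounded loss) gives, in
expectation over $S$ and the internal randomness of $A$,
\[
  \Bigl|\,\mathbb{E}\bigl[L_{\mathcal{D}}(A(S)) - L_{S}(A(S))\bigr]\,\Bigr|
  \;\le\; e^{\varepsilon}-1 \;\approx\; \varepsilon
  \quad\text{for small }\varepsilon.
\]
\end{document}
```

The bounds referred to in the description improve on results of this flavor in both constants and rate of convergence; the statement above only conveys the general mechanism by which a DP guarantee controls the generalization gap.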