Empirical Risk Minimization with Relative Entropy Regularization
Author: Perlaza, Samir M.; Bisson, Gaetan; Esnaola, Iñaki; Jean-Marie, Alain; Rini, Stefano
Publication Year: 2022
Document Type: Working Paper
DOI: 10.1109/TIT.2024.3365728
Description: The empirical risk minimization (ERM) problem with relative entropy regularization (ERM-RER) is investigated under the assumption that the reference measure is a $\sigma$-finite measure, not necessarily a probability measure. This assumption generalizes the ERM-RER problem and allows a larger degree of flexibility for incorporating prior knowledge; under it, numerous relevant properties are established. Among these properties, the solution to this problem, if it exists, is shown to be a unique probability measure, mutually absolutely continuous with the reference measure. Such a solution exhibits a probably approximately correct (PAC) guarantee for the ERM problem, independently of whether the latter possesses a solution. For a fixed dataset and under a specific condition, the empirical risk is shown to be a sub-Gaussian random variable when the models are sampled from the solution to the ERM-RER problem. The generalization capabilities of this solution (the Gibbs algorithm) are studied via the sensitivity of the expected empirical risk to deviations from the solution towards alternative probability measures. Finally, a connection among sensitivity, generalization error, and lautum information is established.
Comment: Appears in IEEE Transactions on Information Theory. Submitted June 2023; revised October 2023; accepted January 2024; camera-ready February 2024. Also available as Research Report No. RR-9454, INRIA, Centre Inria d'Université Côte d'Azur, Sophia Antipolis, France, February 2022. Latest version: Version 7.
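For orientation, the following is a minimal LaTeX sketch of the ERM-RER formulation and its Gibbs solution as summarized in the description above. The notation ($\theta$ for models, $z$ for the fixed dataset, $\mathsf{L}_z$ for the empirical risk, $Q$ for the $\sigma$-finite reference measure, $\lambda > 0$ for the regularization factor, $K_z(\lambda)$ for the log-normalizing constant) is assumed here and may not match the paper's own conventions.

```latex
% Sketch of the ERM-RER problem and its Gibbs solution, under assumed
% notation: theta ranges over models, z is a fixed dataset, L_z is the
% empirical risk, Q is the sigma-finite reference measure, lambda > 0
% is the regularization factor, and D(. || .) is relative entropy.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% ERM-RER: minimize the expected empirical risk under a candidate
% measure P, regularized by the relative entropy of P with respect
% to the reference measure Q.
\begin{equation*}
  \min_{P \ll Q} \; \int \mathsf{L}_{z}(\theta)\, \mathrm{d}P(\theta)
  \;+\; \lambda\, \mathrm{D}\bigl(P \,\|\, Q\bigr).
\end{equation*}

% If a solution exists, the description states it is the unique
% probability measure mutually absolutely continuous with Q; in the
% standard Gibbs form its Radon-Nikodym derivative reads
\begin{equation*}
  \frac{\mathrm{d}P^{\ast}}{\mathrm{d}Q}(\theta)
  = \exp\!\Bigl(-\tfrac{1}{\lambda}\,\mathsf{L}_{z}(\theta)
    - K_{z}(\lambda)\Bigr),
  \qquad
  K_{z}(\lambda)
  = \log \int \exp\!\Bigl(-\tfrac{1}{\lambda}\,\mathsf{L}_{z}(\theta)\Bigr)
    \mathrm{d}Q(\theta).
\end{equation*}

% Lautum information (Palomar and Verdu), which the description
% connects to sensitivity and generalization error: the relative
% entropy between the product of the marginals and the joint measure.
\begin{equation*}
  \mathrm{L}(X; Y)
  = \mathrm{D}\bigl(P_{X} P_{Y} \,\|\, P_{X Y}\bigr).
\end{equation*}

\end{document}
```

The lautum information displayed last follows the definition of Palomar and Verdú; its precise role in the sensitivity-generalization connection is developed in the paper itself, not here.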
Database: arXiv