Fast learning rates with heavy-tailed losses
Author: Dinh, Vu; Ho, Lam Si Tung; Nguyen, Duy; Nguyen, Binh T.
Year of publication: 2016
Subject:
Document type: Working Paper
Description: We study fast learning rates when the losses are not necessarily bounded and may have a distribution with heavy tails. To enable such analyses, we introduce two new conditions: (i) the envelope function $\sup_{f \in \mathcal{F}}|\ell \circ f|$, where $\ell$ is the loss function and $\mathcal{F}$ is the hypothesis class, exists and is $L^r$-integrable, and (ii) $\ell$ satisfies the multi-scale Bernstein's condition on $\mathcal{F}$. Under these assumptions, we prove that learning rates faster than $O(n^{-1/2})$ can be obtained and, depending on $r$ and the multi-scale Bernstein's powers, can be arbitrarily close to $O(n^{-1})$. We then verify these assumptions and derive fast learning rates for the problem of vector quantization by $k$-means clustering with heavy-tailed distributions. The analyses enable us to obtain novel learning rates that extend and complement existing results in the literature from both theoretical and practical viewpoints. (The two assumptions are sketched in a standard single-scale form after this record.) Comment: Advances in Neural Information Processing Systems (NIPS 2016), 11 pages
Database: arXiv
External link:
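
For orientation, a minimal sketch of the two assumptions in a standard single-scale form, as they are commonly stated in the fast-rates literature; the paper's multi-scale Bernstein's condition generalizes the second by allowing the exponent to depend on the scale of the excess risk, and its exact formulation may differ. The constants $B$, $r$, and the exponent $\beta$ below are generic placeholders, not the paper's notation.

$$
\mathbb{E}\Big[\big(\sup_{f \in \mathcal{F}} |\ell \circ f|\big)^{r}\Big] < \infty
\qquad \text{(integrable envelope, for some } r > 1\text{)},
$$

$$
\mathbb{E}\big[(\ell \circ f - \ell \circ f^{*})^{2}\big]
\;\le\; B \,\big(\mathbb{E}[\ell \circ f - \ell \circ f^{*}]\big)^{\beta}
\qquad \text{for all } f \in \mathcal{F},
$$

where $f^{*} \in \arg\min_{f \in \mathcal{F}} \mathbb{E}[\ell \circ f]$, $B > 0$, and $\beta \in (0,1]$. For bounded losses and a hypothesis class of moderate complexity, the single-scale condition with exponent $\beta$ classically yields excess-risk rates of roughly order $n^{-1/(2-\beta)}$, which moves from near $O(n^{-1/2})$ toward $O(n^{-1})$ as $\beta$ increases to $1$; the rates in the abstract play the analogous role when boundedness is replaced by the $L^r$ envelope condition.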