Showing 1 - 10 of 150 for search: '"Lecué, Guillaume"'
We obtain upper bounds for the estimation error of Kernel Ridge Regression (KRR) for all non-negative regularization parameters, offering a geometric perspective on various phenomena in KRR. As applications: 1. We address the multiple descent problem…
External link:
http://arxiv.org/abs/2404.07709
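A minimal numpy sketch of the KRR estimator discussed above, covering both a positive regularization parameter and the ridgeless limit lam = 0 (handled via the pseudo-inverse); the Gaussian kernel and bandwidth are illustrative choices, not taken from the paper:

import numpy as np

def krr_fit_predict(X_train, y_train, X_test, lam, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between two sets of points
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))

    K = kernel(X_train, X_train)
    n = len(y_train)
    if lam > 0:
        # standard KRR: alpha = (K + lam * n * I)^{-1} y
        alpha = np.linalg.solve(K + lam * n * np.eye(n), y_train)
    else:
        # lam = 0: minimum-norm interpolation via the pseudo-inverse
        alpha = np.linalg.pinv(K) @ y_train
    return kernel(X_test, X_train) @ alpha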
Authors:
Lecué, Guillaume, Neirac, Lucie
Motivated by several examples, we consider a general framework of learning with linear loss functions. In this context, we provide excess risk and estimation bounds that hold with large probability for four estimators: ERM, minmax MOM and their regularized versions…
External link:
http://arxiv.org/abs/2310.17293
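For orientation, a short numpy sketch of the median-of-means (MOM) gradient heuristic commonly used to compute MOM-type estimators in practice; the squared loss, block count K and step size are illustrative stand-ins, not the linear-loss setting or the exact minmax estimator analyzed in the paper:

import numpy as np

def mom_gradient_step(w, X, y, K, lr):
    # Split the data into K blocks, compute the loss on each block,
    # and take a gradient step on the block realizing the median loss.
    idx = np.random.permutation(len(y))
    blocks = np.array_split(idx, K)
    losses = [np.mean((X[b] @ w - y[b]) ** 2) for b in blocks]
    med = blocks[int(np.argsort(losses)[len(blocks) // 2])]
    grad = 2 * X[med].T @ (X[med] @ w - y[med]) / len(med)
    return w - lr * grad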
Authors:
Lecué, Guillaume, Shang, Zong
In the linear regression model, the minimum $\ell_2$-norm interpolant estimator has received much attention since it was proved to be consistent even though it fits noisy data perfectly under some condition on the covariance matrix $\Sigma$ of the input vector…
External link:
http://arxiv.org/abs/2203.05873
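The estimator in question has a one-line implementation: among all solutions of $Xw = y$ in the overparametrized regime, the Moore-Penrose pseudo-inverse returns the one of minimum $\ell_2$ norm. A small self-contained sketch with illustrative dimensions:

import numpy as np

def min_l2_norm_interpolant(X, y):
    # Among all w with X @ w = y (underdetermined system, n < d),
    # the pseudo-inverse picks the solution of smallest l2 norm.
    return np.linalg.pinv(X) @ y

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 100))   # n = 20 samples, d = 100 features
y = rng.normal(size=20)
w = min_l2_norm_interpolant(X, y)
assert np.allclose(X @ w, y)     # interpolates the (noisy) data exactly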
Authors:
Depersin, Jules, Lecué, Guillaume
We consider the problem of robust mean and location estimation w.r.t. any pseudo-norm of the form $x\in\mathbb{R}^d\to \|x\|_S = \sup_{v\in S}\langle v, x\rangle$ where $S$ is any symmetric subset of $\mathbb{R}^d$. We show that the deviation-optimal minimax subgaussian…
External link:
http://arxiv.org/abs/2102.00995
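To make the objects concrete, a small sketch of the pseudo-norm $\|x\|_S=\sup_{v\in S}\langle v,x\rangle$ for a finite symmetric $S$, together with the block means that underlie MOM-style estimators; the choice $S=\{\pm e_i\}$ (giving the sup-norm) is purely illustrative, and this sketch only shows the ingredients, not the paper's estimator:

import numpy as np

def pseudo_norm(x, S):
    # ||x||_S = sup over v in S of <v, x>, for a finite symmetric set S
    return max(v @ x for v in S)

def block_means(X, K):
    # Means of K disjoint blocks: the basic ingredient of MOM estimators.
    blocks = np.array_split(np.random.permutation(len(X)), K)
    return np.array([X[b].mean(axis=0) for b in blocks])

S = np.vstack([np.eye(3), -np.eye(3)])            # S = {±e_i}: ||x||_S = max_i |x_i|
print(pseudo_norm(np.array([1.0, -2.0, 0.5]), S)) # 2.0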
Authors:
Depersin, Jules, Lecué, Guillaume
We consider median-of-means (MOM) versions of the Stahel-Donoho outlyingness (SDO) [Stahel, 1981; Donoho, 1982] and of the Median Absolute Deviation (MAD) functions to construct subgaussian estimators of a mean vector under adversarial contamination and heavy-tailed…
External link:
http://arxiv.org/abs/2101.09117
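For background, a sketch of the classical Stahel-Donoho outlyingness, approximated over a finite set of projection directions; the paper replaces the median and MAD below by median-of-means counterparts, which this sketch does not implement:

import numpy as np

def sdo(x, X, directions):
    # Outlyingness of x w.r.t. the sample X: worst-case standardized
    # deviation of the projection <v, x> from the projected median.
    scores = []
    for v in directions:
        proj = X @ v
        med = np.median(proj)
        mad = max(np.median(np.abs(proj - med)), 1e-12)
        scores.append(abs(v @ x - med) / mad)
    return max(scores)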
Many statistical learning problems have recently been shown to be amenable to Semi-Definite Programming (SDP), with community detection and clustering in Gaussian mixture models as the most striking instances [Javanmard et al., 2016]. Given the growing…
External link:
http://arxiv.org/abs/2004.01869
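A hedged sketch of the kind of SDP relaxation the entry refers to, here for two-community detection, assuming the cvxpy package; the centering of the adjacency matrix and the eigenvector rounding are simplifications for illustration, not the constructions studied in the paper:

import numpy as np
import cvxpy as cp

def community_sdp(A):
    # Relax the combinatorial problem max over x in {-1,1}^n of x' B x
    # to an SDP: maximize <B, Z> s.t. Z PSD and diag(Z) = 1, then round.
    n = A.shape[0]
    B = A - A.mean()                      # crude centering, illustrative only
    Z = cp.Variable((n, n), PSD=True)
    problem = cp.Problem(cp.Maximize(cp.trace(B @ Z)), [cp.diag(Z) == 1])
    problem.solve()
    _, vecs = np.linalg.eigh(Z.value)
    return np.sign(vecs[:, -1])           # community labels in {-1, +1}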
Authors:
Depersin, Jules, Lecué, Guillaume
We construct an algorithm, running in time $\tilde{\mathcal O}(N d + uK d)$, which is robust to outliers and heavy-tailed data and which achieves the subgaussian rate from [Lugosi and Mendelson]: $\sqrt{\operatorname{Tr}(\Sigma)/N} + \sqrt{\|\Sigma\|_{\mathrm{op}}\,K/N}$…
External link:
http://arxiv.org/abs/1906.03058
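A simple stand-in for the robust mean estimation task discussed above: the geometric median (Weiszfeld iterations) of K block means. This has MOM-style robustness but is not the nearly-linear-time descent algorithm of the paper; K and the iteration count are illustrative:

import numpy as np

def robust_mean(X, K, n_iter=100):
    # Geometric median of K block means, via Weiszfeld's algorithm.
    blocks = np.array_split(np.random.permutation(len(X)), K)
    Z = np.array([X[b].mean(axis=0) for b in blocks])
    mu = Z.mean(axis=0)
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(Z - mu, axis=1), 1e-12)
        mu = (Z / d[:, None]).sum(axis=0) / (1.0 / d).sum()
    return mu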
We establish risk bounds for Regularized Empirical Risk Minimizers (RERM) when the loss is Lipschitz and convex and the regularization function is a norm. In the first part, we obtain these results in the i.i.d. setup under subgaussian assumptions on the…
External link:
http://arxiv.org/abs/1905.04281
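A minimal sketch of one RERM instance matching the entry's assumptions: the absolute loss (Lipschitz and convex) penalized by the $\ell_1$ norm, fitted by plain subgradient descent; the particular loss, penalty, step size and iteration count are illustrative choices:

import numpy as np

def rerm_l1(X, y, lam, lr=0.01, n_iter=1000):
    # Minimize (1/n) * sum_i |<x_i, w> - y_i| + lam * ||w||_1
    # by subgradient descent.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        g = X.T @ np.sign(X @ w - y) / n + lam * np.sign(w)
        w -= lr * g
    return w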
Hyperparameter tuning and model selection are important steps in machine learning. Unfortunately, classical hyperparameter calibration and model selection procedures are sensitive to outliers and heavy-tailed data. In this work, we construct a selection procedure…
External link:
http://arxiv.org/abs/1812.02435
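To illustrate the idea of outlier-robust calibration, a sketch that scores each candidate regularization parameter by a median-of-means estimate of its validation error instead of a plain average; ridge regression, the candidate grid and the block count K are illustrative, not the paper's procedure:

import numpy as np

def ridge(X, y, lam):
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def mom_val_error(w, X, y, K):
    # Median of K block means of the squared validation error:
    # a single corrupted block cannot dominate the score.
    blocks = np.array_split(np.random.permutation(len(y)), K)
    return np.median([np.mean((X[b] @ w - y[b]) ** 2) for b in blocks])

def select_lambda(X_tr, y_tr, X_val, y_val, lambdas, K=10):
    errs = [mom_val_error(ridge(X_tr, y_tr, lam), X_val, y_val, K)
            for lam in lambdas]
    return lambdas[int(np.argmin(errs))]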