Showing 1 - 3 of 3 for search: '"Parletta, Daniela A."'
In this paper, we provide novel optimal (or near optimal) convergence rates in expectation for the last iterate of a clipped version of the stochastic subgradient method. We consider nonsmooth convex problems, over possibly unbounded domains, under h…
External link: http://arxiv.org/abs/2410.00573
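This first entry concerns a clipped variant of the stochastic subgradient method. As a rough illustration only (the exact update, step-size schedule, and clipping rule of arXiv:2410.00573 are not reproduced here), a generic clipped subgradient step can be sketched as follows; the function names, the toy L1 objective, and the noise model are assumptions made for this sketch.

```python
# Minimal illustrative sketch of a clipped stochastic subgradient step.
# NOT the algorithm from the paper; it only shows the generic clipping idea:
# the noisy subgradient g is rescaled so its norm never exceeds a threshold tau.
import numpy as np

def clipped_subgradient_step(x, stochastic_subgradient, step_size, tau):
    """One clipped stochastic subgradient update (hypothetical helper)."""
    g = stochastic_subgradient(x)
    norm = np.linalg.norm(g)
    if norm > tau:                 # clip: rescale g to have norm tau
        g = g * (tau / norm)
    return x - step_size * g

# Toy usage: minimize the nonsmooth convex function f(x) = ||x||_1
# with finite-variance (Student-t) noise added to the subgradient.
rng = np.random.default_rng(0)
def noisy_l1_subgradient(x):
    return np.sign(x) + rng.standard_t(df=3, size=x.shape)

x = np.ones(5)
for t in range(1, 1001):
    x = clipped_subgradient_step(x, noisy_l1_subgradient,
                                 step_size=0.1 / np.sqrt(t), tau=1.0)
print(x)  # iterates drift toward the minimizer at 0
```

Clipping simply bounds the influence of rare, very large noisy subgradients on any single update, which is the standard device for coping with heavy-tailed gradient noise.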
In this work we study high probability bounds for stochastic subgradient methods under heavy-tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to a sub-Gaussian distribution, for which it is known that standard…
External link: http://arxiv.org/abs/2208.08567
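The second entry is about the noise model itself: the gradient noise is only assumed to have finite variance rather than being sub-Gaussian. The short numerical sketch below contrasts the two regimes using arbitrary illustrative distributions (Gaussian vs. Student-t), which are assumptions for this example and are not taken from the paper.

```python
# Sketch: sub-Gaussian vs. heavy-tailed noise with the same variance.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

sub_gaussian = rng.normal(0.0, 1.0, size=n)                      # light tails
# Student-t with df=2.5 has finite variance df/(df-2)=5; rescale to unit variance.
heavy_tailed = rng.standard_t(df=2.5, size=n) / np.sqrt(2.5 / 0.5)

for name, sample in [("sub-Gaussian", sub_gaussian), ("heavy-tailed", heavy_tailed)]:
    print(f"{name:>12}: std={sample.std():.2f}, "
          f"99.99th pct of |noise|={np.quantile(np.abs(sample), 0.9999):.1f}")
# Both samples have (roughly) unit variance, but the heavy-tailed one produces
# far larger rare deviations, which is why high-probability guarantees need
# extra care in this setting.
```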
Designing learning algorithms that are resistant to perturbations of the underlying data distribution is a problem of wide practical and theoretical importance. We present a general approach to this problem focusing on unsupervised learning. The key…
External link: http://arxiv.org/abs/2012.07399