SGD with Clipping is Secretly Estimating the Median Gradient

Authors: Schaipp, Fabian; Garrigos, Guillaume; Simsekli, Umut; Gower, Robert
Year of publication: 2024
Subject:
Document type: Working Paper
Description: There are several applications of stochastic optimization where one can benefit from a robust estimate of the gradient: for example, distributed learning with corrupted nodes, training data containing large outliers, learning under privacy constraints, or heavy-tailed noise induced by the dynamics of the algorithm itself. Here we study SGD with robust gradient estimators based on estimating the median. We first consider computing the median gradient across samples, and show that the resulting method can converge even under heavy-tailed, state-dependent noise. We then derive iterative methods, based on the stochastic proximal point method, for computing the geometric median and generalizations thereof. Finally, we propose an algorithm that estimates the median gradient across iterations, and find that several well-known methods, in particular different forms of clipping, are special cases of this framework.
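The last point suggests a simple picture: maintain a running median-gradient estimate across iterations, where each update clips the residual between the new stochastic gradient and the current estimate, and descend along that estimate. Below is a minimal Python sketch of this idea under stated assumptions; the function names, step sizes, and clipping threshold are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np

def clip(v, c):
    """Clip vector v to Euclidean norm at most c."""
    norm = np.linalg.norm(v)
    return v if norm <= c else (c / norm) * v

def sgd_with_median_tracking(grad_fn, x0, steps, lr=0.01, track_lr=0.1, c=1.0):
    """SGD whose descent direction is a running estimate m of the median
    gradient, updated by a clipped step toward each new stochastic gradient.
    The clipped update is a proximal-point-style step for a median-like
    estimate; resetting m to the raw gradient each step recovers plain
    clipped SGD. Illustrative sketch only, not the paper's exact method.
    """
    x = x0.copy()
    m = np.zeros_like(x0)  # running median-gradient estimate
    for _ in range(steps):
        g = grad_fn(x)                     # stochastic gradient sample
        m = m + track_lr * clip(g - m, c)  # clipped correction toward g
        x = x - lr * m                     # descend along the robust estimate
    return x

# Usage: quadratic objective with heavy-tailed (Cauchy) gradient noise,
# the regime where a median estimate is more robust than a mean.
rng = np.random.default_rng(0)
grad_fn = lambda x: 2 * x + rng.standard_cauchy(x.shape)
x_final = sgd_with_median_tracking(grad_fn, x0=np.ones(5), steps=2000)
```

Because each correction to m is norm-clipped, a single heavy-tailed gradient sample can shift the estimate by at most track_lr * c, which is the robustness property the median-based view makes explicit.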
Database: arXiv