Convergence of the Iterates for Momentum and RMSProp for Local Smooth Functions: Adaptation is the Key

Authors: Bensaid, Bilel; Poëtte, Gaël; Turpault, Rodolphe
Publication year: 2024
Document type: Working Paper
Description: Accelerated and adaptive gradient methods are among the state-of-the-art algorithms for training neural networks, but they require hyperparameter tuning to work efficiently. For classical gradient descent, a general and efficient way to adapt hyperparameters is Armijo backtracking. The goal of this work is to generalize the Armijo line search to Momentum and RMSProp, two popular optimizers of this family, by means of the stability theory of dynamical systems. We establish convergence results for these strategies under the Łojasiewicz assumption. As a direct consequence, we obtain the first guarantee of convergence of the iterates for RMSProp in the non-convex setting, without the classical boundedness assumptions.
Database: arXiv
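For context, the classical Armijo backtracking rule for plain gradient descent, which the paper sets out to generalize to Momentum and RMSProp, can be sketched as follows. This is a minimal illustration only; the function names and constants (`c`, `rho`, `eta0`) are generic choices, not taken from the paper, and the paper's actual contribution is the extension of this idea to the momentum and adaptive settings.

```python
import numpy as np

def armijo_step(f, grad_f, x, eta0=1.0, c=1e-4, rho=0.5, max_backtracks=50):
    """One gradient step with Armijo backtracking.

    The step size eta is shrunk by a factor rho until the
    sufficient-decrease condition
        f(x - eta * g) <= f(x) - c * eta * ||g||^2
    holds (or the backtracking budget is exhausted).
    """
    g = grad_f(x)
    fx = f(x)
    eta = eta0
    for _ in range(max_backtracks):
        if f(x - eta * g) <= fx - c * eta * np.dot(g, g):
            break
        eta *= rho
    return x - eta * g

# Usage: minimize the quadratic f(x) = ||x||^2 / 2, whose gradient is x.
f = lambda x: 0.5 * np.dot(x, x)
grad_f = lambda x: x
x = np.array([3.0, -4.0])
for _ in range(100):
    x = armijo_step(f, grad_f, x)
```

Because the step size is chosen per iteration from the sufficient-decrease test rather than fixed in advance, no learning-rate tuning is needed for this simple example.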