Showing 1 - 6 of 6 for search: '"Williamson, Måns"'
Author:
Williamson, Måns, Stillfjord, Tony
Gradient normalization and soft clipping are two popular techniques for tackling instability issues and improving convergence of stochastic gradient descent (SGD) with momentum. In this article, we study these types of methods through the lens of dis…
External link:
http://arxiv.org/abs/2406.16649
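The abstract above mentions gradient normalization and soft clipping applied to SGD with momentum. The following is a minimal illustrative sketch, not the paper's exact scheme: the soft-clipping factor `tau / (tau + ||g||)` is one common choice, and the parameter names and update order here are assumptions.

```python
import numpy as np

def sgd_momentum_step(w, v, grad, lr=0.01, beta=0.9, tau=1.0):
    """One SGD-with-momentum step with soft gradient clipping.

    The stochastic gradient is rescaled by tau / (tau + ||g||),
    which bounds the norm of the rescaled gradient by tau while
    leaving small gradients nearly unchanged. This is one common
    soft-clipping variant (assumption); the paper's scheme may differ.
    """
    g_norm = np.linalg.norm(grad)
    scaled = grad * tau / (tau + g_norm)  # soft clipping: ||scaled|| < tau
    v = beta * v + scaled                 # momentum accumulation
    w = w - lr * v                        # parameter update
    return w, v
```

For a large gradient the rescaled norm approaches `tau` but never exceeds it, which is the bounded-update property such methods rely on for stability.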
Choosing the optimization algorithm that performs best on a given machine learning problem is often delicate, and there is no guarantee that current state-of-the-art algorithms will perform well across all tasks. Consequently, the more reliable metho…
External link:
http://arxiv.org/abs/2406.16640
Author:
Stillfjord, Tony, Williamson, Måns
We introduce a family of stochastic optimization methods based on the Runge-Kutta-Chebyshev (RKC) schemes. The RKC methods are explicit methods originally designed for solving stiff ordinary differential equations by ensuring that their stability reg…
External link:
http://arxiv.org/abs/2201.12782
We consider a stochastic version of the proximal point algorithm for optimization problems posed on a Hilbert space. A typical application of this is supervised learning. While the method is not new, it has not been extensively analyzed in this form.
External link:
http://arxiv.org/abs/2010.12348
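The abstract above concerns a stochastic proximal point method, where each step minimizes a sampled loss plus a proximity term rather than taking a gradient step. For a least-squares loss on a single sample the proximal step has a standard closed form, sketched below as an illustration only; the choice of loss and the step-size name `lr` are assumptions, not taken from the paper.

```python
import numpy as np

def spp_step(w, x_i, y_i, lr=0.1):
    """One stochastic proximal point step for least-squares regression.

    Solves  w+ = argmin_z  0.5 * (x_i @ z - y_i)**2 + ||z - w||**2 / (2 * lr),
    whose closed form follows from setting the gradient in z to zero.
    Illustrative sketch; the paper analyzes the method on a Hilbert space.
    """
    residual = x_i @ w - y_i
    # The residual after the step shrinks by a factor 1 / (1 + lr * ||x_i||^2),
    # so the update is stable for any positive step size lr.
    return w - lr * residual / (1.0 + lr * (x_i @ x_i)) * x_i
```

Unlike an explicit SGD step, this implicit step contracts the sampled residual for every positive `lr`, which is the kind of stability property that motivates proximal point methods.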
Author:
Stillfjord, Tony, Williamson, Måns
Published in:
Journal of Computational and Applied Mathematics, vol. 417, 1 January 2023
Academic article