Showing 1 - 2 of 2 for search: '"Kreisler, Itai"'
We propose a method that achieves near-optimal rates for smooth stochastic convex optimization and requires essentially no prior knowledge of problem parameters. This improves on prior work, which requires knowing at least the initial distance to optimality…
External link:
http://arxiv.org/abs/2404.00666
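The sketch below illustrates a "distance over gradients" style parameter-free step size, in the spirit of the line of work this abstract builds on rather than the paper's exact accelerated method; the function name, the toy noisy-quadratic objective, and all constants are illustrative assumptions.

```python
# Minimal sketch of a parameter-free SGD step size: eta_t is the running
# maximum distance from the start point divided by the root of the running
# sum of squared gradient norms, so no learning rate needs to be tuned.
# All names and constants here are illustrative assumptions.
import numpy as np

def parameter_free_sgd(grad_fn, x0, steps=1000, r_eps=1e-6):
    x = x0.copy()
    max_dist = r_eps          # small seed for the max-distance estimate
    grad_norm_sq_sum = 0.0    # running sum of squared gradient norms
    for _ in range(steps):
        g = grad_fn(x)
        grad_norm_sq_sum += float(np.dot(g, g))
        eta = max_dist / np.sqrt(grad_norm_sq_sum + r_eps)
        x = x - eta * g
        max_dist = max(max_dist, float(np.linalg.norm(x - x0)))
    return x

# Usage: minimize a noisy quadratic without knowing the distance to the
# optimum (here 3.0) or the noise level in advance.
rng = np.random.default_rng(0)
grad = lambda x: 2.0 * (x - 3.0) + 0.1 * rng.standard_normal(x.shape)
print(parameter_free_sgd(grad, np.zeros(1)))  # ends near [3.0]
```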
Recent research shows that when Gradient Descent (GD) is applied to neural networks, the loss almost never decreases monotonically. Instead, the loss oscillates as gradient descent converges to its "Edge of Stability" (EoS). Here, we find a quantity…
External link:
http://arxiv.org/abs/2305.13064
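As a toy illustration of the Edge-of-Stability phenomenon this abstract describes, the sketch below runs GD on the two-parameter scalar network f(x) = u·v with loss (uv − 1)²/2; the step size and initialization are illustrative assumptions. For this model, gradient flow preserves u² − v², so the sharpness of the gradient flow solution reachable from (u, v) has the closed form sqrt((u² − v²)² + 4); the printed loss oscillates while that quantity shrinks.

```python
# Minimal EoS sketch on the scalar network f(x) = u*v with loss
# L = (u*v - 1)^2 / 2. The step size 0.5 makes minima with sharpness
# above 2/eta = 4 unstable, so the loss oscillates; meanwhile the
# sharpness of the gradient-flow solution, sqrt((u^2 - v^2)^2 + 4),
# shrinks, since one GD step scales u^2 - v^2 by (1 - eta^2 * r^2).
# Step size and initialization are illustrative assumptions.
import math

eta = 0.5                      # 2/eta = 4 < initial sharpness => EoS
u, v = 2.0, 0.55               # unbalanced init, u^2 + v^2 ~ 4.3
for t in range(40):
    r = u * v - 1.0            # residual
    loss = 0.5 * r * r
    gfs_sharpness = math.sqrt((u * u - v * v) ** 2 + 4.0)
    if t % 4 == 0:
        print(f"t={t:2d}  loss={loss:8.4f}  GFS sharpness={gfs_sharpness:.4f}")
    u, v = u - eta * r * v, v - eta * r * u  # one GD step
```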