Large Learning Rates Improve Generalization: But How Large Are We Talking About?

Author: Lobacheva, Ekaterina, Pockonechnyy, Eduard, Kodryan, Maxim, Vetrov, Dmitry
Publication Year: 2023
Subject:
Document Type: Working Paper
Description: Inspired by recent research that recommends starting neural network training with large learning rates (LRs) to achieve the best generalization, we explore this hypothesis in detail. Our study clarifies the initial LR ranges that provide optimal results for subsequent training with a small LR or weight averaging. We find that these ranges are in fact significantly narrower than generally assumed. We conduct our main experiments in a simplified setup that allows precise control of the learning rate hyperparameter and validate our key findings in a more practical setting.
Comment: Published in the Mathematics of Modern Machine Learning Workshop at NeurIPS 2023. The first two authors contributed equally.
Database: arXiv
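
The abstract describes a two-phase protocol: pre-train with a large LR, then continue with a small LR (or average weights). The sketch below is a minimal illustration of that protocol, not the authors' code; the PyTorch model, synthetic data, step counts, and the LR values 0.5 and 0.01 are all assumptions chosen for demonstration.

```python
# Minimal sketch of the two-phase schedule described in the abstract.
# All specifics (model, data, LR values, step counts) are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(256, 20)  # toy synthetic inputs (assumption)
y = torch.randn(256, 1)   # toy regression targets (assumption)
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
loss_fn = nn.MSELoss()

# Hypothetical values; the paper's point is that the range of
# "good" initial large LRs is narrower than generally assumed.
LARGE_LR, SMALL_LR = 0.5, 0.01
optimizer = torch.optim.SGD(model.parameters(), lr=LARGE_LR)

# Phase 1: pre-train with a large LR.
for step in range(200):
    optimizer.zero_grad()
    loss_fn(model(X), y).backward()
    optimizer.step()

# Phase 2: switch to a small LR and continue training.
for group in optimizer.param_groups:
    group["lr"] = SMALL_LR
for step in range(200):
    optimizer.zero_grad()
    loss_fn(model(X), y).backward()
    optimizer.step()
```

Per the abstract, weight averaging over the large-LR phase is an alternative second phase; only the small-LR continuation is sketched here.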