On graduated optimization for stochastic non-convex problems
Author: Hazan, E.; Levy, K. Y.; Shalev-Shwartz, S.
Source: Scopus-Elsevier
Description: The graduated optimization approach, also known as the continuation method, is a popular heuristic for solving non-convex problems that has received renewed interest over the last decade. Despite its popularity, very little is known in terms of theoretical convergence analysis. In this paper we describe a new first-order algorithm based on graduated optimization and analyze its performance. We characterize a parameterized family of non-convex functions for which this algorithm provably converges to a global optimum. In particular, we prove that the algorithm converges to an ε-approximate solution within O(1/ε²) gradient-based steps. We extend our algorithm and analysis to the setting of stochastic non-convex optimization with noisy gradient feedback, attaining the same convergence rate. Additionally, we discuss the setting of zero-order optimization and devise a variant of our algorithm that converges at a rate of O(d²/ε⁴). Comment: 17 pages
Database: OpenAIRE
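The continuation idea described in the abstract, minimizing a heavily smoothed surrogate first and gradually sharpening it, can be made concrete with a short sketch. What follows is a minimal illustration, not the paper's algorithm: the ball-averaging smoothing, the fixed halving schedule, and the names `smoothed_grad` and `graduated_descent` are all assumptions made here for illustration.

```python
import numpy as np

def smoothed_grad(grad_f, x, delta, n_samples=20, rng=None):
    # Monte-Carlo estimate of the gradient of the delta-smoothed objective
    # f_delta(x) = E_u[f(x + delta*u)], u uniform in the unit ball
    # (one common smoothing construction; an assumption of this sketch).
    rng = rng or np.random.default_rng()
    g = np.zeros_like(x)
    for _ in range(n_samples):
        u = rng.normal(size=x.shape)
        u /= np.linalg.norm(u)                # uniform direction on the sphere
        u *= rng.random() ** (1.0 / x.size)   # radius making the ball uniform
        g += grad_f(x + delta * u)
    return g / n_samples

def graduated_descent(grad_f, x0, delta0=2.0, shrink=0.5, n_stages=8,
                      steps_per_stage=300, lr=0.02, rng=None):
    # Continuation loop: run gradient descent on successively less-smoothed
    # objectives, warm-starting each stage at the previous stage's answer.
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float).copy()
    delta = delta0
    for _ in range(n_stages):
        for _ in range(steps_per_stage):
            x = x - lr * smoothed_grad(grad_f, x, delta, rng=rng)
        delta *= shrink   # sharpen: f_delta approaches the true objective
    return x

# Example: a 1-D non-convex objective f(x) = x^2 + 3*sin(2x)^2, whose
# spurious local minima can trap plain gradient descent.
grad = lambda x: 2.0 * x + 6.0 * np.sin(4.0 * x)    # d/dx [x^2 + 3 sin^2(2x)]
print(graduated_descent(grad, x0=np.array([3.0])))  # ≈ [0.], the global minimum
```

Because the loop only ever consumes noisy gradient estimates, it also doubles as a stand-in for the stochastic setting the abstract mentions; swapping `grad_f` for a finite-difference estimator of f would give a zero-order variant, at a worse rate, consistent with the abstract's O(d²/ε⁴) remark.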