Second-Order Information in Non-Convex Stochastic Optimization: Power and Limitations
Author: Arjevani, Yossi; Carmon, Yair; Duchi, John C.; Foster, Dylan J.; Sekhari, Ayush; Sridharan, Karthik
Publication year: 2020
Document type: Working Paper
Description: We design an algorithm which finds an $\epsilon$-approximate stationary point (with $\|\nabla F(x)\|\le \epsilon$) using $O(\epsilon^{-3})$ stochastic gradient and Hessian-vector products, matching guarantees that were previously available only under a stronger assumption of access to multiple queries with the same random seed. We prove a lower bound which establishes that this rate is optimal and, surprisingly, that it cannot be improved using stochastic $p$th-order methods for any $p\ge 2$, even when the first $p$ derivatives of the objective are Lipschitz. Together, these results characterize the complexity of non-convex stochastic optimization with second-order methods and beyond. Expanding our scope to the oracle complexity of finding $(\epsilon,\gamma)$-approximate second-order stationary points, we establish nearly matching upper and lower bounds for stochastic second-order methods. Our lower bounds here are novel even in the noiseless case.
Comment: Accepted to the Conference on Learning Theory (COLT) 2020
Database: arXiv
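The description counts oracle calls to stochastic gradients and Hessian-vector products. As background on the second oracle type (this is not the paper's algorithm), here is a minimal sketch, assuming JAX, of computing a Hessian-vector product via forward-over-reverse automatic differentiation at roughly the cost of a single gradient evaluation; the toy objective `loss` and all names are illustrative, not from the paper.

```python
import jax
import jax.numpy as jnp

def hvp(f, x, v):
    # Hessian-vector product: differentiate grad(f) along direction v.
    # Forward-over-reverse autodiff returns H(x) @ v without ever
    # materializing the full d x d Hessian.
    return jax.jvp(jax.grad(f), (x,), (v,))[1]

# Toy non-convex objective standing in for a stochastic loss F.
def loss(x):
    return jnp.sum(jnp.cos(x) + 0.1 * x ** 2)

x = jnp.ones(5)
v = jnp.arange(5.0)
print(hvp(loss, x, v))  # costs a small constant multiple of one gradient
```

Because such products avoid forming the Hessian explicitly, counting them alongside stochastic gradients (as the $O(\epsilon^{-3})$ bound does) remains meaningful even in high dimension.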