Sketched Newton-Raphson
Authors: | Rui Yuan, Alessandro Lazaric, Robert M. Gower |
Year: | 2020 |
Subject: | |
DOI: | 10.48550/arxiv.2006.12120 |
Description: | We propose a new globally convergent stochastic second-order method. Our starting point is the development of a new Sketched Newton-Raphson (SNR) method for solving large-scale nonlinear equations of the form $F(x)=0$ with $F:\mathbb{R}^p \rightarrow \mathbb{R}^m$. We then show how to design several stochastic second-order optimization methods by rewriting the optimization problem of interest as a system of nonlinear equations and applying SNR. For instance, by applying SNR to find a stationary point of a generalized linear model (GLM), we derive new, scalable stochastic second-order methods. We show that the resulting method is highly competitive with state-of-the-art variance-reduced methods. Furthermore, using a variable splitting trick, we also show that the Stochastic Newton method (SNM) is a special case of SNR, and use this connection to establish the first global convergence theory of SNM. We establish the global convergence of SNR by showing that it is a variant of the stochastic gradient descent (SGD) method, and then leveraging proof techniques from SGD. As a special case, our theory also provides a new global convergence theory for the original Newton-Raphson method under strictly weaker assumptions than the classic monotone convergence theory. Comment: Accepted to SIAM Journal on Optimization. 47 pages, 4 figures |
Database: | OpenAIRE |
External link: |
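
The description above states the SNR idea only in words. Below is a minimal, illustrative Python sketch of a sketch-and-project Newton update of the form $x_{k+1} = x_k - \gamma \, DF(x_k)^\top S_k \big(S_k^\top DF(x_k) DF(x_k)^\top S_k\big)^\dagger S_k^\top F(x_k)$, using fresh Gaussian sketching matrices $S_k$. The function names, step size, and sketch distribution are assumptions made for this sketch; this is not the authors' reference implementation.

```python
import numpy as np

def snr(F, JF, x0, sketch_dim=5, gamma=1.0, iters=500, tol=1e-10, seed=0):
    """Illustrative Sketched Newton-Raphson loop for solving F(x) = 0.

    Each step draws a fresh m x tau Gaussian sketch S and applies
        x <- x - gamma * J^T S (S^T J J^T S)^+ S^T F(x),
    where J = DF(x). All choices here (Gaussian S, pseudo-inverse solve,
    fixed step size gamma) are assumptions for this sketch.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        J = JF(x)                               # m x p Jacobian of F at x
        S = rng.standard_normal((len(Fx), sketch_dim))
        JS = J.T @ S                            # p x tau sketched directions
        # Solve the small tau x tau sketched system via a pseudo-inverse.
        lam = np.linalg.pinv(JS.T @ JS) @ (S.T @ Fx)
        x = x - gamma * JS @ lam
    return x

# Toy usage: root of a small nonlinear system with solution (1, 2).
if __name__ == "__main__":
    F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
    JF = lambda x: np.array([[2*x[0], 1.0], [1.0, 2*x[1]]])
    x = snr(F, JF, x0=[1.0, 1.0], sketch_dim=2)
    print(x, F(x))
```

In the toy run the sketch dimension equals $m$, so the sketched system is essentially full and the iteration behaves like plain Newton-Raphson; the point of SNR is that with $\tau \ll m$ each step only requires solving a small $\tau \times \tau$ system.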