Linear Convergence of Descent Methods for the Unconstrained Minimization of Restricted Strongly Convex Functions
Author: | Frank Schöpfer |
---|---|
Year of publication: | 2016 |
Subject: |
Hessian matrix, Stochastic gradient descent, Rate of convergence, Broyden–Fletcher–Goldfarb–Shanno algorithm, Conjugate gradient method, Applied mathematics, Descent direction, Gradient descent, Convex function, Software, Mathematics, Theoretical Computer Science, Combinatorics, 02 engineering and technology, 0209 industrial biotechnology, 020901 industrial engineering & automation, 0211 other engineering and technologies, 021103 operations research |
Source: | SIAM Journal on Optimization. 26:1883-1911 |
ISSN: | 1095-7189, 1052-6234 |
DOI: | 10.1137/140992990 |
Description: | Linear convergence rates of descent methods for unconstrained minimization are usually proved under the assumption that the objective function is strongly convex. Recently it was shown that the weaker assumption of restricted strong convexity suffices for linear convergence of the ordinary gradient descent method. A decisive difference from strong convexity is that the set of minimizers of a restricted strongly convex function need be neither a singleton nor bounded. In this paper we extend the linear convergence results under this weaker assumption to a larger class of descent methods, including restarted nonlinear CG, BFGS, and its damped limited memory variants, L-D-BFGS. For twice continuously differentiable objective functions we even obtain r-step superlinear convergence for the CG_DESCENT conjugate gradient method of Hager and Zhang, where r is greater than or equal to the rank of the Hessian at a minimizer. This is remarkable since the Hessian of a restricted strongly convex function need not have full rank. |
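As a minimal numerical sketch (not from the paper): the classical example of a restricted strongly convex but not strongly convex objective is least squares with a rank-deficient matrix, f(x) = 0.5·‖Ax − b‖². Its minimizer set is an affine subspace, so it is neither a singleton nor bounded, yet plain gradient descent with step 1/L still converges linearly, as the abstract states. The matrix, data, and step size below are illustrative choices.

```python
import numpy as np

# f(x) = 0.5*||A x - b||^2 with rank(A) = 2 < 3 unknowns:
# restricted strongly convex, NOT strongly convex; minimizers form
# an unbounded affine subspace (every x with A x = b).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
b = np.array([1.0, 2.0])        # b lies in range(A), so min f = 0

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

L = np.linalg.norm(A.T @ A, 2)  # Lipschitz constant of grad f (here 3)
x = np.zeros(3)
gaps = []                       # optimality gaps f(x_k) - f* with f* = 0
for _ in range(50):
    x = x - grad(x) / L         # ordinary gradient descent
    gaps.append(f(x))

# Linear convergence: the gap shrinks by a near-constant factor per step,
# governed by the smallest NONZERO eigenvalue of A^T A (here 1), i.e. the
# restricted strong convexity constant -- full rank is not needed.
ratios = [gaps[k + 1] / gaps[k] for k in range(10)]
```

Here the eigenvalues of AᵀA are 3, 1, and 0; after the first step the per-iteration gap ratio settles near (1 − 1/3)² = 4/9 ≈ 0.444, a linear rate despite the singular Hessian.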
Database: | OpenAIRE |
External link: |