Advances in Low-Memory Subgradient Optimization

Author: Alexander Gasnikov, Pavel Dvurechensky, E.A. Nurminski, Fedor Stonyakin
Language: English
Year of publication: 2019
Source: Numerical Nonsmooth Optimization ISBN: 9783030349097
Description: This chapter is devoted to black-box subgradient algorithms with minimal requirements for the storage of the auxiliary results needed to execute them. It starts with the original result of N.Z. Shor, which opened this field, applied to the classical transportation problem. To discuss the fundamentals of non-smooth optimization, the theoretical complexity bounds for smooth and non-smooth convex and quasi-convex optimization problems are briefly exposed, with special attention given to adaptive step-size policies. The chapter then describes different modern techniques that allow non-smooth convex optimization problems to be solved faster than the lower complexity bounds suggest: the Nesterov smoothing technique, the Nesterov universal approach, and the Legendre (saddle-point) representation approach. We also describe the recent Universal Mirror Prox algorithm for variational inequalities. To demonstrate the application of non-smooth convex optimization algorithms to the solution of huge-scale extremal problems, we consider convex optimization problems with non-smooth functional constraints and describe two adaptive Mirror Descent methods (see the sketch following this record). The first method is of primal-dual nature and is proved to be optimal in terms of lower oracle bounds for the class of Lipschitz-continuous convex objectives and constraints. The advantages of applying this method to the sparse Truss Topology Design problem are discussed in some detail. The second method can be applied to the solution of convex and quasi-convex optimization problems and is optimal in the sense of complexity bounds. The concluding part of the survey contains important references characterizing recent developments in non-smooth convex optimization.
Lecture Notes in Computer Science, 2019; 36 pages
Database: OpenAIRE
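
To give a flavor of the adaptive Mirror Descent methods for problems with functional constraints mentioned in the abstract, below is a minimal sketch of a switching subgradient scheme for min f(x) subject to g(x) <= 0 over a convex set. It is written for the Euclidean setup, where the Mirror Descent step reduces to a projected subgradient step; the helper names (subgrad_f, subgrad_g, project) and the step-size rule eps/||d||^2 are illustrative assumptions, not the chapter's exact algorithm.

```python
import numpy as np

def switching_subgradient(f, g, subgrad_f, subgrad_g, project,
                          x0, eps, max_iter=10000):
    """Sketch of an adaptive switching subgradient scheme for
    min f(x) subject to g(x) <= 0 over a convex feasible set.

    Euclidean setting, so the Mirror Descent step is a projected
    subgradient step. Helper names are illustrative assumptions.
    """
    x = np.asarray(x0, dtype=float)
    productive, weights = [], []
    for _ in range(max_iter):
        if g(x) <= eps:
            # "Productive" step: constraint nearly satisfied,
            # so move along a subgradient of the objective.
            d = subgrad_f(x)
            h = eps / (np.dot(d, d) + 1e-15)  # adaptive step size
            productive.append(x.copy())
            weights.append(h)
        else:
            # "Non-productive" step: reduce constraint violation
            # using a subgradient of the constraint function.
            d = subgrad_g(x)
            h = eps / (np.dot(d, d) + 1e-15)
        x = project(x - h * d)
    if not productive:
        return x  # eps-feasibility never reached; return last iterate
    # Weighted average of productive iterates approximates a solution.
    return np.average(productive, axis=0, weights=weights)
```

The adaptive step size eps/||d||^2 and the weighted averaging over productive iterates illustrate the adaptivity the abstract refers to; the primal-dual optimality certificates and the quasi-convex variant discussed in the chapter are beyond this sketch.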