A new unified framework for designing convex optimization methods with prescribed theoretical convergence estimates: A numerical analysis approach

Authors: Ushiyama, Kansei; Sato, Shun; Matsuo, Takayasu
Year of publication: 2023
Document type: Working Paper
Description: We propose a new unified framework for describing and designing gradient-based convex optimization methods from a numerical analysis perspective. The key is the new concept of weak discrete gradients (weak DGs), which generalize the discrete gradients (DGs) standard in numerical analysis. Via weak DGs, we consider abstract optimization methods and prove unified convergence rate estimates that hold independently of the choice of weak DG, up to some constants in the final estimate. With particular choices of weak DGs, we can reproduce many popular existing methods, such as steepest descent and Nesterov's accelerated gradient method, as well as some recent variants from the numerical analysis community. By considering new weak DGs, we can easily explore new theoretically guaranteed optimization methods; we show some examples. We believe this work is the first attempt to fully integrate the research branches of optimization and numerical analysis, which have so far developed independently.
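As a hedged illustration of the idea (not code from the paper): the simplest admissible choice of a weak discrete gradient is the exact gradient evaluated at the current iterate, and with that choice the abstract scheme reduces to plain steepest descent, x_{k+1} = x_k - h ∇f(x_k). A minimal Python sketch on an illustrative convex quadratic:

```python
# Hypothetical sketch: recovering steepest descent as one choice of
# "weak discrete gradient" (the exact gradient at the current iterate).
# The test function f and the step size are illustrative assumptions,
# not taken from the paper.

def f(x):
    # Convex quadratic f(x) = x1^2 + 2*x2^2 (illustrative example).
    return x[0] ** 2 + 2.0 * x[1] ** 2

def grad_f(x):
    # Exact gradient of f; here it plays the role of the weak DG.
    return [2.0 * x[0], 4.0 * x[1]]

def steepest_descent(x0, step=0.1, iters=100):
    # Abstract update x_{k+1} = x_k - step * weakDG(x_k), specialized
    # to weakDG = grad_f, i.e. classical steepest descent.
    x = list(x0)
    for _ in range(iters):
        g = grad_f(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

x_star = steepest_descent([1.0, 1.0])
```

For a smooth convex f with a suitable step size, the iterates decrease f monotonically toward its minimum; other choices of weak DG would instead yield, e.g., implicit or accelerated schemes under the same abstract update.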
Database: arXiv