Level-set Subdifferential Error Bounds and Linear Convergence of Variable Bregman Proximal Gradient Method
Author: Zhu, Daoli; Deng, Sien; Li, Minghua; Zhao, Lei
Year of publication: 2020
Document type: Working Paper
Description: In this work, we develop a level-set subdifferential error bound condition aimed at the convergence rate analysis of a variable Bregman proximal gradient (VBPG) method for a broad class of nonsmooth and nonconvex optimization problems. We prove that this condition guarantees linear convergence of VBPG and is weaker than the Kurdyka-Łojasiewicz property, weak metric subregularity, and the Bregman proximal error bound. Along the way, we derive a number of verifiable conditions under which level-set subdifferential error bounds hold, as well as necessary and sufficient conditions for linear convergence relative to a level set for nonsmooth and nonconvex optimization problems. The newly established results not only enable us to show that any accumulation point of the sequence generated by VBPG is at least a critical point of the limiting subdifferential, or even a critical point of the proximal subdifferential when a fixed Bregman function is used in each iteration, but also provide a fresh perspective for exploring the interconnections among many known sufficient conditions for linear convergence of various first-order methods. Comment: arXiv admin note: substantial text overlap with arXiv:1905.08445
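As a concrete illustration of the iteration the description refers to: VBPG-type methods typically update x^{k+1} ∈ argmin_x { g(x) + ⟨∇f(x^k), x − x^k⟩ + (1/α_k) D_{h_k}(x, x^k) }, where D_{h_k} is the Bregman distance generated by a kernel h_k that may vary from iteration to iteration. The sketch below fixes the kernel to h(x) = ½‖x‖², in which case the update reduces to the classical proximal gradient step. This is a minimal sketch under that assumption, not the paper's algorithm; the names bregman_prox_grad, grad_f, prox_g, and the LASSO-type usage example are hypothetical.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def bregman_prox_grad(grad_f, prox_g, x0, step, max_iter=1000, tol=1e-8):
    """Proximal gradient iteration, i.e. a Bregman proximal gradient step
    with the kernel fixed to h(x) = 0.5 * ||x||^2 in every iteration
    (a simplification for illustration; VBPG lets the kernel vary)."""
    x = x0.copy()
    for _ in range(max_iter):
        # With D_h(x, y) = 0.5 * ||x - y||^2 the Bregman step reduces to
        # x_{k+1} = prox_{step * g}(x_k - step * grad_f(x_k)).
        x_new = prox_g(x - step * grad_f(x), step)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Hypothetical usage: LASSO-type problem min_x 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)               # gradient of the smooth part
prox_g = lambda v, t: soft_threshold(v, lam * t)   # prox of the nonsmooth part
step = 1.0 / np.linalg.norm(A, ord=2) ** 2         # 1/L with L = ||A||_2^2
x_hat = bregman_prox_grad(grad_f, prox_g, np.zeros(50), step)
```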
Database: arXiv