Showing 1 - 10 of 127 for search: '"Zhu, Daoli"'
In this paper we propose a proximal subgradient method (Prox-SubGrad) for solving nonconvex and nonsmooth optimization problems without assuming Lipschitz continuity conditions. A number of subgradient upper bounds and their relationships are presented…
External link:
http://arxiv.org/abs/2308.16362
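For orientation, a generic proximal subgradient step for a composite objective f(x) + r(x) (the paper's exact scheme may differ in its step-size and subgradient choices) reads
\[
x^{k+1} = \operatorname{prox}_{\gamma_k r}\!\bigl(x^k - \gamma_k g^k\bigr), \qquad g^k \in \partial f(x^k),
\]
where \gamma_k > 0 is a step size and \operatorname{prox}_{\gamma r} denotes the proximal mapping of r.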
In this paper we consider a non-monotone (mixed) variational inequality model with (nonlinear) convex conic constraints. Through developing an equivalent Lagrangian function-like primal-dual saddle-point system for the VI model in question, we introduce…
External link:
http://arxiv.org/abs/2306.01214
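For reference, a mixed variational inequality of the kind mentioned here is commonly written as: find x^* \in C such that
\[
\langle F(x^*), x - x^*\rangle + \varphi(x) - \varphi(x^*) \ge 0 \quad \text{for all } x \in C,
\]
with F possibly non-monotone and C a convex set described by conic constraints; this is the standard textbook form, not necessarily the exact model of the paper.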
The subgradient method is one of the most fundamental algorithmic schemes for nonsmooth optimization. The existing complexity and convergence results for this method are mainly derived for Lipschitz continuous objective functions. In this work, we first…
External link:
http://arxiv.org/abs/2305.14161
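The classical subgradient iteration in question is
\[
x^{k+1} = x^k - \alpha_k g^k, \qquad g^k \in \partial f(x^k),
\]
with step sizes \alpha_k > 0; the snippet indicates that the work analyzes such schemes without the usual Lipschitz continuity assumption on f.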
Coordinate-type subgradient methods for addressing nonsmooth optimization problems are relatively underexplored due to the set-valued nature of the subdifferential. In this work, our study focuses on nonsmooth composite optimization problems, encompassing…
External link:
http://arxiv.org/abs/2206.14981
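A coordinate-type subgradient step, written generically (the paper's method for the composite setting may differ in detail), updates only one block i_k per iteration:
\[
x^{k+1}_{i_k} = x^k_{i_k} - \alpha_k g^k_{i_k}, \qquad g^k \in \partial f(x^k), \qquad x^{k+1}_j = x^k_j \ \ (j \neq i_k).
\]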
Author:
Zhu, Daoli, Zhao, Lei
Linearly constrained convex programming has many practical applications, including support vector machine and machine learning portfolio problems. We propose the randomized primal-dual coordinate (RPDC) method, a randomized coordinate extension of the…
External link:
http://arxiv.org/abs/2008.12946
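The underlying model here is linearly constrained convex programming,
\[
\min_x \; f(x) \quad \text{s.t.} \quad Ax = b,
\]
which primal-dual methods treat by alternating updates of x and of the multiplier for Ax = b; a randomized coordinate variant updates a single randomly chosen block of x per iteration (a generic description, not the paper's exact update).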
In this work, we develop a level-set subdifferential error bound condition aiming towards convergence rate analysis of a variable Bregman proximal gradient (VBPG) method for a broad class of nonsmooth and nonconvex optimization problems. It is proved…
External link:
http://arxiv.org/abs/2008.13627
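A variable Bregman proximal gradient step for minimizing f(x) + r(x) with f differentiable, written generically with a Bregman distance D_{h_k}, is
\[
x^{k+1} \in \arg\min_x \Bigl\{ \langle \nabla f(x^k), x \rangle + r(x) + \tfrac{1}{\gamma_k} D_{h_k}(x, x^k) \Bigr\},
\]
where the kernel h_k may vary with the iteration; the exact variant analyzed in the paper may differ.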
Nonlinearly constrained nonconvex and nonsmooth optimization models play an increasingly important role in machine learning, statistics and data analytics. In this paper, based on the augmented Lagrangian function we introduce a flexible first-order…
External link:
http://arxiv.org/abs/2007.12219
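The augmented Lagrangian referred to here has, for equality constraints c(x) = 0, the standard form
\[
L_\rho(x, \lambda) = f(x) + \langle \lambda, c(x) \rangle + \tfrac{\rho}{2}\,\|c(x)\|^2,
\]
with penalty parameter \rho > 0; the paper's first-order scheme is built on an augmented Lagrangian of this kind, though its precise construction for nonlinear constraints may differ.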
Author:
Zhao, Lei, Zhu, Daoli
Large-scale nonconvex and nonsmooth problems have attracted considerable attention in the fields of compressed sensing, big data optimization and machine learning. Exploring effective methods remains a central challenge in current research. Stochastic…
External link:
http://arxiv.org/abs/1905.10926
Author:
Zhu, Daoli, Deng, Sien
We develop a new variational approach on level sets aiming towards convergence rate analysis of a variable Bregman proximal gradient (VBPG) method for a broad class of nonsmooth and nonconvex optimization problems. With this new approach, we are able…
External link:
http://arxiv.org/abs/1905.08445
Author:
Zhu, Daoli, Zhao, Lei
We introduce a stochastic coordinate extension of the first-order primal-dual method studied by Cohen and Zhu (1984) and Zhao and Zhu (2018) to solve Composite Optimization with Composite Cone-constraints (COCC). In this method, we randomly choose a…
External link:
http://arxiv.org/abs/1905.01020
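Composite Optimization with Composite Cone-constraints (COCC) can be written, in a generic form that may differ from the paper's exact model, as
\[
\min_x \; f(x) + r(x) \quad \text{s.t.} \quad g(x) \in -\mathcal{K},
\]
with \mathcal{K} a closed convex cone; a stochastic coordinate method of this type chooses one primal block at random to update at each iteration.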