Showing 1 - 10 of 842 for search: '"Wang Xianfu"'
The level proximal subdifferential was recently introduced by Rockafellar as a tool for studying proximal mappings of possibly nonconvex functions. In this paper we give a systematic study of the level proximal subdifferential, characterize variational convexity …
External link:
http://arxiv.org/abs/2406.00648
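As background for this entry (standard definitions, not the paper's new construction): for a proper lower semicontinuous $f$ and $\lambda > 0$, the proximal mapping and the classical proximal subdifferential are
\[
\operatorname{prox}_{\lambda f}(x) = \operatorname*{argmin}_{y}\Big\{ f(y) + \tfrac{1}{2\lambda}\|y - x\|^2 \Big\}, \qquad
v \in \partial_p f(x) \iff \exists\, r, \varepsilon > 0:\; f(y) \ge f(x) + \langle v, y - x\rangle - \tfrac{r}{2}\|y - x\|^2 \ \text{ for all } \|y - x\| \le \varepsilon.
\]
The level proximal subdifferential itself is defined in the paper and refines $\partial_p f$.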
Monotone inclusion problems occur in many areas of optimization and variational analysis. Splitting methods, which utilize resolvents or proximal mappings of the underlying operators, are often applied to solve these problems. In 2022, Bredies, Chenchene, …
External link:
http://arxiv.org/abs/2307.09747
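In standard notation, the inclusion problem and the resolvent used by such splitting methods are
\[
\text{find } x \text{ with } 0 \in (A + B)x, \qquad J_{\gamma A} = (\mathrm{Id} + \gamma A)^{-1}, \quad \gamma > 0,
\]
where $J_{\gamma A}$ is single-valued and firmly nonexpansive whenever $A$ is maximally monotone.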
Author:
Wang, Xianfu, Wang, Ziyuan
We propose a level proximal subdifferential for a proper lower semicontinuous function. The level proximal subdifferential is a uniform refinement of the well-known proximal subdifferential, and has the pleasant feature that its resolvent always coincides with the proximal mapping of the function …
External link:
http://arxiv.org/abs/2303.02282
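For context, when $f$ is in addition convex this coincidence is classical (Moreau): the resolvent of the convex subdifferential is exactly the proximal mapping,
\[
(\mathrm{Id} + \lambda\,\partial f)^{-1} = \operatorname{prox}_{\lambda f}, \qquad \lambda > 0;
\]
the level proximal subdifferential is designed so that an analogous identity holds without convexity.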
The solution of the cubic equation has a century-long history; however, the usual presentation is geared towards applications in algebra and is somewhat inconvenient to use in optimization, where the main interest frequently lies in the real roots. In this …
External link:
http://arxiv.org/abs/2302.10731
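As an illustration of the real-root viewpoint, here is a minimal Python sketch of the classical case analysis (trigonometric form when all three roots are real, Cardano otherwise); it is standard background, not the presentation developed in the paper:

import math

def real_cubic_roots(a, b, c, d):
    """Return the real roots of a*x^3 + b*x^2 + c*x + d = 0 (a != 0)."""
    # Depress the cubic: substituting x = t - b/(3a) yields t^3 + p*t + q = 0.
    p = (3*a*c - b*b) / (3*a*a)
    q = (2*b**3 - 9*a*b*c + 27*a*a*d) / (27*a**3)
    shift = -b / (3*a)
    disc = -4*p**3 - 27*q*q  # > 0 iff three distinct real roots
    if disc > 0:
        # Trigonometric (Viete) form; note p < 0 on this branch.
        m = 2*math.sqrt(-p/3)
        theta = math.acos(3*q / (p*m)) / 3
        return sorted(shift + m*math.cos(theta - 2*math.pi*k/3) for k in range(3))
    # One real root via Cardano with real cube roots
    # (the repeated-root case disc == 0 also lands here).
    s = math.sqrt(q*q/4 + p**3/27)
    u = math.copysign(abs(-q/2 + s)**(1.0/3), -q/2 + s)
    v = math.copysign(abs(-q/2 - s)**(1.0/3), -q/2 - s)
    return [shift + u + v]

For example, real_cubic_roots(1, -6, 11, -6) returns [1.0, 2.0, 3.0] up to floating-point rounding.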
Published in:
J Optim Theory Appl (2024)
This work investigates a Bregman and inertial extension of the forward-reflected-backward algorithm [Y. Malitsky and M. Tam, SIAM J. Optim., 30 (2020), pp. 1451--1472] applied to structured nonconvex minimization problems under relative smoothness. …
External link:
http://arxiv.org/abs/2212.01504
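For reference, the Euclidean forward-reflected-backward step of Malitsky and Tam, which this work extends with Bregman distances and inertia, reads (specialized to minimization of $f + g$ with $f$ smooth)
\[
x_{k+1} = \operatorname{prox}_{\lambda g}\!\big(x_k - \lambda\,(2\nabla f(x_k) - \nabla f(x_{k-1}))\big).
\]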
Author:
Wang, Xianfu, Wang, Ziyuan
We propose a Bregman inertial forward-reflected-backward (BiFRB) method for nonconvex composite problems. Our analysis relies on a novel approach that imposes general conditions on implicit merit function parameters, which yields a stepsize condition …
External link:
http://arxiv.org/abs/2207.01170
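The Bregman machinery used here is standard (notation may differ from the paper's): given a suitable convex kernel $h$, the Bregman distance and the associated proximal operator are
\[
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x), y - x \rangle, \qquad
\operatorname{prox}^h_{\lambda g}(x) = \operatorname*{argmin}_{y}\Big\{ g(y) + \tfrac{1}{\lambda} D_h(y, x) \Big\},
\]
which reduce to $\tfrac{1}{2}\|y - x\|^2$ and the usual proximal mapping for $h = \tfrac{1}{2}\|\cdot\|^2$.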
The Fenchel-Young inequality is fundamental in Convex Analysis and Optimization. It states that the difference between certain function values of two vectors and their inner product is nonnegative. Recently, Carlier introduced a very nice sharpening …
External link:
http://arxiv.org/abs/2206.14872
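Concretely, with $f^*(y) = \sup_x \{\langle x, y\rangle - f(x)\}$ denoting the Fenchel conjugate, the inequality states
\[
f(x) + f^*(y) \ge \langle x, y \rangle \quad \text{for all } x, y,
\]
with equality if and only if $y \in \partial f(x)$ (for proper lower semicontinuous convex $f$).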
Published in:
Applied Set-Valued Analysis and Optimization 5 (2023), No. 2, pp. 163-180
In $\mathbb{R}^3$, a hyperbolic paraboloid is a classical saddle-shaped quadric surface. Recently, Elser has modeled problems arising in Deep Learning using rectangular hyperbolic paraboloids in $\mathbb{R}^n$. Motivated by his work, we provide a rigorous …
External link:
http://arxiv.org/abs/2206.04878
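The classical surface in question is, in suitable coordinates,
\[
z = \frac{x^2}{a^2} - \frac{y^2}{b^2}, \qquad a, b > 0,
\]
and its higher-dimensional rectangular analogues in $\mathbb{R}^n$ are the objects studied in the paper.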
Published in:
International Journal of Pressure Vessels and Piping, December 2024, Vol. 212, Part B
Finding a zero of a sum of maximally monotone operators is a fundamental problem in modern optimization and nonsmooth analysis. Assuming that the resolvents of the operators are available, this problem can be tackled with the Douglas-Rachford algorithm …
External link:
http://arxiv.org/abs/2203.03832
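In its standard form for $0 \in Ax + Bx$ with stepsize $\gamma > 0$, the Douglas-Rachford iteration is
\[
z_{k+1} = z_k + J_{\gamma B}\big(2 J_{\gamma A} z_k - z_k\big) - J_{\gamma A} z_k,
\]
and the shadow sequence $x_k = J_{\gamma A} z_k$ converges to a zero of $A + B$ when one exists.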