Showing 1 - 10 of 11 for search: '"Nadav Hallak"'
Author:
Amir Beck, Nadav Hallak
Published in:
Operations Research Letters. 50:517-523
Author:
Nadav Hallak, Marc Teboulle
Published in:
Mathematics of Operations Research.
This paper develops a novel adaptive augmented Lagrangian-based method to address the comprehensive class of nonsmooth nonconvex models with a nonlinear functional composite structure in the objective. The proposed method uses an adaptive mechanism…
Published in:
Journal of Optimization Theory and Applications. 193:324-353
This paper studies the minimization of a broad class of nonsmooth nonconvex objective functions subject to nonlinear functional equality constraints, where the gradients of the differentiable parts in the objective and the constraints are only locally…
Author:
Marc Teboulle, Nadav Hallak
Published in:
Journal of Optimization Theory and Applications. 186:480-503
This paper introduces a method for computing points satisfying the second-order necessary optimality conditions for nonconvex minimization problems subject to a closed and convex constraint set. The method comprises two independent steps corresponding…
Author:
Nadav Hallak, Amir Beck
Published in:
SIAM Journal on Optimization. 30:56-79
This paper studies the class of nonsmooth nonconvex problems in which the difference between a continuously differentiable function and a convex nonsmooth function is minimized over linear constraints…
Author:
Marc Teboulle, Nadav Hallak
Published in:
Operations Research Letters. 47:421-426
We propose a method that incorporates a non-Euclidean gradient descent step with a generic matrix sketching procedure for solving unconstrained nonconvex matrix optimization problems in which the decision variable cannot be stored in memory due to…
Published in:
NeurIPS 2020-34th International Conference on Neural Information Processing Systems
NeurIPS 2020-34th International Conference on Neural Information Processing Systems, Dec 2020, Vancouver, Canada. pp.1-32
Scopus-Elsevier
This paper analyzes the trajectories of stochastic gradient descent (SGD) to help understand the algorithm's convergence properties in non-convex problems. We first show that the sequence of iterates generated by SGD remains bounded and converges with…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a134aebe5ae854b4fe534d2ae62ec9bd
https://hal.inria.fr/hal-03043771/document
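The record above concerns convergence of SGD trajectories on non-convex problems. As a minimal, hedged sketch (not the paper's analysis: the objective f(x) = x^4 - 2x^2, the step size, and the Gaussian noise model are all illustrative assumptions), plain SGD on a one-dimensional non-convex function:

```python
import random

def sgd(grad, x0, step=0.01, iters=2000, noise=0.1, seed=0):
    """Plain SGD: x <- x - step * (grad(x) + Gaussian noise)."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        x -= step * (grad(x) + rng.gauss(0.0, noise))  # stochastic gradient step
    return x

# Illustrative non-convex objective f(x) = x^4 - 2x^2 with gradient 4x^3 - 4x;
# its critical points are x = -1, 0, 1, and x = 0 is a local maximum.
x_final = sgd(lambda x: 4 * x**3 - 4 * x, x0=0.5)
```

Starting at x0 = 0.5, the iterates stay bounded and drift toward the minimizer x = 1; the paper's trajectory analysis is far more general than this toy run.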
Published in:
Scopus-Elsevier
We demonstrate two new important properties of the 1-path-norm of shallow neural networks. First, despite its non-smoothness and non-convexity, it admits a closed-form proximal operator which can be efficiently computed, allowing the use of stochastic…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::13031b07a0cd4b620a07e01787669cd7
http://arxiv.org/abs/2007.01003
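The abstract above states that the 1-path-norm admits an efficiently computable closed-form proximal operator; that formula is not reproduced here. As a hedged illustration of what a closed-form prox looks like, here is the standard soft-thresholding prox of the ℓ1-norm (an illustrative stand-in, simpler than the 1-path-norm case):

```python
def soft_threshold(x, lam):
    """Closed-form prox of lam * ||.||_1: shrink each entry toward zero by lam."""
    out = []
    for xi in x:
        mag = abs(xi) - lam
        out.append(0.0 if mag <= 0 else (mag if xi > 0 else -mag))
    return out

print(soft_threshold([2.0, -0.3, 1.0], 0.5))  # → [1.5, 0.0, 0.5]
```

Entries with magnitude below lam are zeroed out, which is what makes such prox steps useful inside stochastic proximal methods.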
Author:
Amir Beck, Nadav Hallak
Published in:
Mathematical Programming. 178:39-67
This paper studies a general-form problem in which a lower-bounded continuously differentiable function is minimized over a block-separable set incorporating a group sparsity expression as a constraint or a penalty (or both) in the group sparsity set…
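A basic operation in group-sparse settings like the one sketched above is projection onto a group-sparsity constraint. As a hedged sketch (the grouping and the keep-largest-norms rule are standard, but this is illustrative, not the paper's algorithm), the Euclidean projection onto the set of vectors with at most s nonzero groups:

```python
import math

def project_group_sparse(x, groups, s):
    """Keep the s groups of x with the largest Euclidean norm; zero out the rest.
    This is the Euclidean projection onto {x : at most s groups are nonzero}."""
    order = sorted(range(len(groups)),
                   key=lambda g: -math.sqrt(sum(x[i] ** 2 for i in groups[g])))
    keep = {i for g in order[:s] for i in groups[g]}
    return [xi if i in keep else 0.0 for i, xi in enumerate(x)]

x = [3.0, 0.1, 0.2, 4.0, 0.5]
groups = [[0, 1], [2], [3, 4]]             # block-separable index groups
print(project_group_sparse(x, groups, 2))  # → [3.0, 0.1, 0.0, 4.0, 0.5]
```

Group norms here are roughly 3.00, 0.20, and 4.03, so the two largest groups survive and the singleton group is zeroed.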
Author:
Amir Beck, Nadav Hallak
Published in:
SIAM Journal on Optimization. 28:496-527
This paper studies a class of problems consisting of minimizing a continuously differentiable function penalized with the so-called $\ell_0$-norm over a symmetric set. These problems are hard to solve, yet prominent in many fields and applications. We…
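For the ℓ0-penalized problems described above, the simplest building block (ignoring the symmetric-set constraint, which is the hard part the paper addresses) is the unconstrained prox of λ‖x‖₀, i.e. componentwise hard thresholding; a minimal sketch:

```python
def prox_l0(x, lam):
    """Prox of lam * ||x||_0: keep x_i iff x_i^2 > 2*lam (hard thresholding).
    At the tie x_i^2 == 2*lam both choices are valid prox points; we keep zero."""
    return [xi if xi * xi > 2.0 * lam else 0.0 for xi in x]

# With lam = 0.5 the threshold is |x_i| > 1.
print(prox_l0([2.0, 0.5, -1.5, 0.9], 0.5))  # → [2.0, 0.0, -1.5, 0.0]
```

The rule follows from minimizing 0.5 * (t - x_i)^2 + lam * [t != 0] per coordinate: keeping x_i costs lam, zeroing it costs 0.5 * x_i^2.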