Showing 1 - 10 of 265 results for search: '"Salzo, A."'
In this paper, we provide novel optimal (or near-optimal) convergence rates in expectation for the last iterate of a clipped version of the stochastic subgradient method. We consider nonsmooth convex problems, over possibly unbounded domains, under h…
External link:
http://arxiv.org/abs/2410.00573
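As a concrete illustration of the clipping idea in this abstract, here is a minimal sketch of a clipped stochastic subgradient loop on a toy nonsmooth convex problem; the objective, step sizes, and clipping level are illustrative assumptions, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(0)

def clipped_step(x, g, gamma, c):
    """One step of the clipped subgradient method: rescale g to norm <= c."""
    norm = np.linalg.norm(g)
    if norm > c:
        g = g * (c / norm)
    return x - gamma * g

# Toy problem (an assumption): minimize E|a^T x - b| with heavy-tailed noise.
d, T = 5, 2000
x = np.zeros(d)
x_star = rng.normal(size=d)
for t in range(1, T + 1):
    a = rng.normal(size=d)
    b = a @ x_star + rng.standard_t(df=2.5)  # finite-variance, heavy-tailed noise
    g = np.sign(a @ x - b) * a               # subgradient of |a^T x - b| at x
    x = clipped_step(x, g, gamma=1.0 / np.sqrt(t), c=1.0)
print("last-iterate distance to x*:", np.linalg.norm(x - x_star))
```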
We study the problem of efficiently computing the derivative of the fixed point of a parametric nondifferentiable contraction map. This problem has wide applications in machine learning, including hyperparameter optimization, meta-learning, and data p…
External link:
http://arxiv.org/abs/2403.11687
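The fixed-point differentiation the abstract refers to can be made concrete via the implicit function theorem: if x*(θ) = T(x*(θ), θ) and T is a contraction in x, then dx*/dθ = (I − ∂ₓT)⁻¹ ∂_θT. Below is a sketch for a smooth linear contraction (the paper's interest is the nondifferentiable case; the map here is an assumption for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
A = 0.5 * A / np.linalg.norm(A, 2)   # spectral norm 0.5 => contraction
b = rng.normal(size=3)

def T(x, theta):
    """Illustrative parametric contraction T(x, theta) = A x + theta * b."""
    return A @ x + theta * b

theta, x = 2.0, np.zeros(3)
for _ in range(200):                 # fixed-point iteration
    x = T(x, theta)

# Implicit function theorem: dx*/dtheta = (I - d_x T)^{-1} d_theta T.
dx = np.linalg.solve(np.eye(3) - A, b)

# Finite-difference check of the derivative.
eps, x_eps = 1e-6, np.zeros(3)
for _ in range(200):
    x_eps = T(x_eps, theta + eps)
print(np.max(np.abs((x_eps - x) / eps - dx)))   # ~ 0
```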
Author:
Roberta De Dona, Manuela Tamburro, Carmen Adesso, Angelo Salzo, Antonio D’Amico, Nicandro Samprati, Arturo Santagata, Michela Anna Di Palma, Anna Natale, Fabio Cannizzaro, Vittorio Viccione, Giancarlo Ripabelli
Published in:
COVID, Vol 4, Iss 10, Pp 1631-1641 (2024)
The Italian sporting event ‘XIV Convittiadi’, involving students at boarding schools, took place in the Molise region, central Italy, in April 2022. The study describes the public health protocol with specific countermeasures developed for the event, i…
External link:
https://doaj.org/article/86b68b89a71b46709f63dfc5b9ff8d79
In recent years, bilevel approaches have become very popular for efficiently estimating high-dimensional hyperparameters of machine learning models. However, to date, binary parameters are handled by continuous relaxation and rounding strategies, which…
External link:
http://arxiv.org/abs/2308.10711
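For intuition about the relaxation-and-rounding baseline the abstract mentions (not the paper's proposed method), here is a toy sketch: a binary feature mask is relaxed to [0, 1], tuned against a validation loss, and rounded at the end. All data and the finite-difference hypergradient are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X_tr, X_val = rng.normal(size=(50, 8)), rng.normal(size=(20, 8))
w_true = np.array([1., -2., 0., 0., 3., 0., 0., 0.])
y_tr, y_val = X_tr @ w_true, X_val @ w_true

def val_loss(m):
    """Lower level: ridge regression on masked features; upper level: val MSE."""
    Xm = X_tr * m
    w = np.linalg.solve(Xm.T @ Xm + 0.1 * np.eye(8), Xm.T @ y_tr)
    r = (X_val * m) @ w - y_val
    return 0.5 * np.mean(r ** 2)

m = 0.5 * np.ones(8)                      # relaxed binary mask in [0, 1]
for _ in range(100):
    g = np.array([(val_loss(m + 1e-5 * e) - val_loss(m - 1e-5 * e)) / 2e-5
                  for e in np.eye(8)])    # finite-difference hypergradient
    m = np.clip(m - 0.5 * g, 0.0, 1.0)
mask = (m > 0.5).astype(float)            # the rounding step the snippet refers to
print(mask)
```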
In the context of finite-sum minimization, variance reduction techniques are widely used to improve the performance of state-of-the-art stochastic gradient methods. Their practical impact is clear, as are their theoretical properties. Stochastic…
External link:
http://arxiv.org/abs/2308.09310
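A minimal sketch of the variance-reduction idea in the finite-sum setting, in the SVRG style (a standard example; the specific method studied in the paper may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
A, y = rng.normal(size=(n, d)), rng.normal(size=n)

def grad_i(w, i):
    """Gradient of the i-th least-squares term 0.5 * (a_i^T w - y_i)^2."""
    return (A[i] @ w - y[i]) * A[i]

w = np.zeros(d)
for epoch in range(20):
    w_snap = w.copy()
    full_grad = A.T @ (A @ w_snap - y) / n   # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient: unbiased, with variance
        # that vanishes as w and w_snap approach the minimizer.
        g = grad_i(w, i) - grad_i(w_snap, i) + full_grad
        w -= 0.01 * g
print("train loss:", 0.5 * np.mean((A @ w - y) ** 2))
```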
In this work we study high-probability bounds for stochastic subgradient methods under heavy-tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to a sub-Gaussian distribution, for which it is known that standard…
External link:
http://arxiv.org/abs/2208.08567
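To see why heavy tails call for more than in-expectation guarantees, here is a tiny simulation contrasting plain and clipped subgradient steps on f(x) = |x| with Student-t noise (finite variance, heavy tails); the high-probability flavour shows up in the upper quantiles of the error over repeated runs. All constants are illustrative assumptions.

```python
import numpy as np

def run(clip, seed, T=1000):
    """Minimize f(x) = |x| with subgradients corrupted by heavy-tailed noise."""
    rng = np.random.default_rng(seed)
    x = 5.0
    for t in range(1, T + 1):
        g = np.sign(x) + rng.standard_t(df=2.5)  # finite variance, heavy tails
        if clip is not None:
            g = np.clip(g, -clip, clip)
        x -= g / np.sqrt(t)
    return abs(x)

# High-probability flavour: look at the worst runs, not the average one.
errs_plain = [run(None, s) for s in range(200)]
errs_clip = [run(3.0, s) for s in range(200)]
print("95th pct, plain  :", np.quantile(errs_plain, 0.95))
print("95th pct, clipped:", np.quantile(errs_clip, 0.95))
```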
Published in:
Journal of Machine Learning Research, 24(167):1-37, 2023
We analyse a general class of bilevel problems, in which the upper-level problem consists in the minimization of a smooth objective function and the lower-level problem is to find the fixed point of a smooth contraction map. This type of problem inc…
External link:
http://arxiv.org/abs/2202.03397
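One standard way to compute the hypergradient in this setting is iterative (unrolled) differentiation of the fixed-point iteration. A scalar sketch on an assumed toy bilevel problem with a closed-form answer for checking:

```python
# Illustrative bilevel problem (assumed, not the paper's experiments):
#   lower level : x(lam) = Phi(x, lam) = 0.5 * x + lam   (a contraction)
#   upper level : f(x(lam)) = 0.5 * (x(lam) - 3)^2
# Closed form: x(lam) = 2 * lam, so the hypergradient is (2 * lam - 3) * 2.

def hypergrad_unrolled(lam, K=50):
    """Iterative differentiation: unroll K fixed-point steps and apply
    the chain rule alongside them (forward mode, scalar case)."""
    x, dx = 0.0, 0.0              # iterate and its derivative w.r.t. lam
    for _ in range(K):
        x, dx = 0.5 * x + lam, 0.5 * dx + 1.0
    return (x - 3.0) * dx         # df/dlam = f'(x) * dx/dlam

lam = 1.0
print(hypergrad_unrolled(lam), (2 * lam - 3) * 2)  # both ~ -2.0
```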
In this paper, we study the convergence properties of a randomized block-coordinate descent algorithm for the minimization of a composite convex objective function, where the block-coordinates are updated asynchronously and randomly, according to an a…
External link:
http://arxiv.org/abs/2201.05498
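A synchronous, uniformly sampled sketch of randomized proximal block-coordinate descent on a composite objective 0.5‖Ax − y‖² + λ‖x‖₁ (the ℓ1 term is separable across blocks); the problem data, block partition, and the crude Lipschitz constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 80, 12
A, y = rng.normal(size=(n, d)), rng.normal(size=n)
lam = 0.1
blocks = np.array_split(np.arange(d), 4)    # 4 coordinate blocks

def soft(v, t):
    """Prox of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2               # crude per-block Lipschitz bound
x = np.zeros(d)
for _ in range(2000):
    B = blocks[rng.integers(len(blocks))]   # pick one block at random
    g = A[:, B].T @ (A @ x - y)             # partial gradient on that block
    x[B] = soft(x[B] - g / L, lam / L)      # prox-gradient step on the block
print("objective:", 0.5 * np.sum((A @ x - y) ** 2) + lam * np.sum(np.abs(x)))
```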
In this work we propose a batch version of the Greenkhorn algorithm for multimarginal regularized optimal transport problems. Our framework is general enough to cover, as particular cases, some existing algorithms like the Sinkhorn and Greenkhorn algorithms…
External link:
http://arxiv.org/abs/2112.00838
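The two-marginal Sinkhorn iteration that the abstract cites as a special case, for entropic optimal transport (notation and problem data are assumptions): rows and columns of the Gibbs kernel are alternately rescaled to match the marginals. Greenkhorn instead greedily updates a single worst row or column per step, and a batch version updates several at once.

```python
import numpy as np

def sinkhorn(C, a, b, eps=0.1, iters=500):
    """Entropic OT between marginals a, b with cost C: alternately rescale
    rows and columns of K = exp(-C / eps) to match the marginals."""
    K = np.exp(-C / eps)
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (K.T @ u)   # fit the column marginal
        u = a / (K @ v)     # fit the row marginal
    return u[:, None] * K * v[None, :]   # transport plan

rng = np.random.default_rng(0)
x, y = rng.normal(size=5), rng.normal(size=7)
C = (x[:, None] - y[None, :]) ** 2
P = sinkhorn(C, np.full(5, 1 / 5), np.full(7, 1 / 7))
print(P.sum(axis=1), P.sum(axis=0))   # ~ the prescribed marginals
```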
Author:
Kostic, Vladimir, Salzo, Saverio
In this work we study the method of Bregman projections for deterministic and stochastic convex feasibility problems, with three types of control sequences for the selection of sets during the algorithmic procedure: greedy, random, and adaptive random…
External link:
http://arxiv.org/abs/2101.01704
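A minimal sketch of the projection machinery with greedy control, specialized to the squared Euclidean Bregman distance (so Bregman projections reduce to ordinary Euclidean ones) and halfspace constraint sets; the problem instance is an assumption for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, d = 30, 5
A = rng.normal(size=(m, d))
b = A @ rng.normal(size=d) + 1.0        # consistent system A x <= b

def project_halfspace(x, a, beta):
    """Euclidean projection onto {x : a^T x <= beta} (the Bregman projection
    when the Bregman distance is the squared Euclidean one)."""
    viol = a @ x - beta
    return x - (viol / (a @ a)) * a if viol > 0 else x

x = 10.0 * np.ones(d)
for _ in range(500):
    viols = A @ x - b
    # Greedy control: project onto the most violated set.
    # Random control would instead be: i = rng.integers(m)
    i = int(np.argmax(viols))
    x = project_halfspace(x, A[i], b[i])
print("max violation:", max(0.0, (A @ x - b).max()))
```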