Showing 1 - 10 of 165 for search: '"Gadat, Sébastien"'
Author:
Lalanne, Clément, Gadat, Sébastien
Publikováno v:
ICML 2024 - 41st International Conference on Machine Learning, Jul 2024, Vienna, Austria. 39 p
Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally-private estimation of Sobolev-smooth densities of probability over the hypercube in dimension d. The contributions … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2409.10083
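As a point of reference for the entry above, here is a minimal sketch of a classical centrally ε-differentially-private density estimator: a histogram on [0, 1]^d whose bin counts are perturbed with Laplace noise. This is a textbook baseline, not the Sobolev-smooth estimator of the paper; the function name, the number of bins and the privacy budget are illustrative choices.

```python
import numpy as np

def dp_histogram_density(samples, n_bins=10, epsilon=1.0, rng=None):
    """Centrally epsilon-DP histogram density estimator on [0, 1]^d (illustrative baseline).
    Adding or removing one sample changes a single bin count by 1, so the L1 sensitivity
    of the count vector is 1 and Laplace noise of scale 1/epsilon gives epsilon-DP."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = samples.shape
    edges = [np.linspace(0.0, 1.0, n_bins + 1)] * d
    counts, _ = np.histogramdd(samples, bins=edges)
    noisy = np.clip(counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape), 0.0, None)
    bin_volume = (1.0 / n_bins) ** d
    return noisy / max(noisy.sum(), 1e-12) / bin_volume   # renormalised so it integrates to ~1

# usage: 1000 uniform points on the unit square, privacy budget epsilon = 0.5
rng = np.random.default_rng(0)
density = dp_histogram_density(rng.uniform(size=(1000, 2)), n_bins=8, epsilon=0.5, rng=rng)
```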
This article studies and solves the problem of optimal portfolio allocation with CV@R penalty when dealing with imperfectly simulated financial assets. We use a Stochastic biased Mirror Descent to find optimal resource allocation for a portfolio whose … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2402.11999
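To give an idea of the kind of method this entry refers to, the sketch below runs stochastic entropic mirror descent on the simplex for a mean-return objective with a CV@R penalty written in Rockafellar-Uryasev form. It assumes unbiased simulations of the asset returns (the paper's point is precisely to handle biased ones); `simulate_returns`, `lam`, `alpha` and the step schedule are hypothetical choices.

```python
import numpy as np

def cvar_mirror_descent(simulate_returns, d, alpha=0.95, lam=1.0,
                        n_iter=20_000, step=0.05, rng=None):
    """Stochastic entropic mirror descent on the simplex for
        min_{w in simplex, t}  -E[w.R] + lam * ( t + E[(-w.R - t)^+] / (1 - alpha) ),
    i.e. expected loss plus a CV@R penalty in Rockafellar-Uryasev form.
    `simulate_returns(rng)` must return one simulated return vector of length d."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.full(d, 1.0 / d)   # portfolio weights, kept on the simplex by the mirror step
    t = 0.0                   # auxiliary level; converges to the VaR at the optimum
    for k in range(1, n_iter + 1):
        R = simulate_returns(rng)
        loss = -w @ R
        tail = 1.0 if loss - t > 0 else 0.0
        g_w = -R - lam * R * tail / (1.0 - alpha)   # stochastic subgradient in w
        g_t = lam * (1.0 - tail / (1.0 - alpha))    # stochastic subgradient in t
        eta = step / np.sqrt(k)
        w = w * np.exp(-eta * g_w)                  # entropic mirror (exponentiated gradient) step
        w /= w.sum()
        t -= eta * g_t
    return w, t

# usage with hypothetical Gaussian asset returns
w, t = cvar_mirror_descent(lambda rng: rng.normal(0.02, 0.1, size=5), d=5)
```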
This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD) specifically tailored for solving sparse optimisation … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2312.05993
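The sketch below only illustrates the random-features ingredient mentioned in this entry: Rahimi-Recht random Fourier features whose inner products approximate a Gaussian kernel. It does not implement Conic Particle Gradient Descent itself; the bandwidth and the number of features are arbitrary.

```python
import numpy as np

def random_fourier_features(X, n_features=2_000, gamma=1.0, rng=None):
    """Map X of shape (n, d) to features Z such that Z @ Z.T approximates the
    Gaussian kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    rng = np.random.default_rng() if rng is None else rng
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)                # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# sanity check against the exact kernel on a few points
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, n_features=5_000, rng=rng)
exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(Z @ Z.T - exact).max())   # should be small, roughly O(1/sqrt(n_features))
```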
In this paper, we investigate a continuous time version of the Stochastic Langevin Monte Carlo method, introduced in [WT11], that incorporates a stochastic sampling step inside the traditional over-damped Langevin diffusion. This method is popular in … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2301.03077
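[WT11] refers to Welling and Teh's Stochastic Gradient Langevin Dynamics. Here is a minimal discrete-time sketch of that baseline, not of the continuous-time version analysed in the paper; the step size, batch size and the toy Gaussian posterior are illustrative.

```python
import numpy as np

def sgld(grad_log_post_minibatch, theta0, n_data, n_iter=5_000,
         batch_size=32, step=1e-4, rng=None):
    """Stochastic Gradient Langevin Dynamics: an Euler step of the over-damped Langevin
    diffusion in which the full-data gradient of the log posterior is replaced by an
    unbiased minibatch estimate.  `grad_log_post_minibatch(theta, idx)` must return that
    estimate, rescaled by n_data / len(idx) for the likelihood part."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    samples = []
    for _ in range(n_iter):
        idx = rng.integers(0, n_data, size=batch_size)
        g = grad_log_post_minibatch(theta, idx)
        theta = theta + 0.5 * step * g + np.sqrt(step) * rng.normal(size=theta.shape)
        samples.append(theta.copy())
    return np.array(samples)

# toy usage: posterior of a Gaussian mean with an N(0, 1) prior and unit-variance data
data = np.random.default_rng(0).normal(1.5, 1.0, size=500)
grad = lambda th, idx: -th + (len(data) / len(idx)) * (data[idx] - th).sum()
chain = sgld(grad, theta0=0.0, n_data=len(data))
```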
We introduce a new second order stochastic algorithm to estimate the entropically regularized optimal transport cost between two probability measures. The source measure can be arbitrarily chosen, either absolutely continuous or discrete, while the target … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2107.05291
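For contrast with the second-order stochastic scheme described in this entry, here is the standard first-order Sinkhorn iteration for the entropically regularised OT cost between two discrete measures; the regularisation strength and the random point clouds in the usage lines are arbitrary.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=500):
    """Classical Sinkhorn iterations for entropically regularised optimal transport
    between two discrete measures a and b with cost matrix C (first-order baseline)."""
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # approximate optimal transport plan
    return (P * C).sum(), P           # transport part of the regularised cost

# usage on two random point clouds with squared Euclidean cost
rng = np.random.default_rng(0)
x, y = rng.normal(size=(30, 2)), rng.normal(loc=1.0, size=(40, 2))
C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
C /= C.max()                          # rescale costs to [0, 1] for numerical safety
cost, P = sinkhorn(np.full(30, 1 / 30), np.full(40, 1 / 40), C, reg=0.05)
```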
Author:
Gadat, Sébastien, Gavra, Ioana
This paper studies some asymptotic properties of adaptive algorithms widely used in optimization and machine learning, among them Adagrad and Rmsprop, which are involved in most of the blackbox deep learning algorithms. Our setup is the non-convex … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2012.05640
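The two update rules named in this entry fit in a few lines; the step sizes, the decay rate `rho` and the noisy quadratic toy problem are illustrative, and the sketch says nothing about the asymptotic analysis carried out in the paper.

```python
import numpy as np

def adagrad_step(theta, g, state, lr=0.1, eps=1e-8):
    """Adagrad: divide the gradient by the root of the *sum* of past squared gradients."""
    state["s"] = state.get("s", np.zeros_like(theta)) + g ** 2
    return theta - lr * g / (np.sqrt(state["s"]) + eps)

def rmsprop_step(theta, g, state, lr=0.01, rho=0.9, eps=1e-8):
    """RMSProp: divide by the root of an *exponential moving average* of squared gradients."""
    state["s"] = rho * state.get("s", np.zeros_like(theta)) + (1 - rho) * g ** 2
    return theta - lr * g / (np.sqrt(state["s"]) + eps)

# usage: minimise ||theta||^2 from noisy gradients
rng = np.random.default_rng(0)
theta, state = np.ones(3), {}
for _ in range(1_000):
    g = 2 * theta + 0.1 * rng.normal(size=3)   # stochastic gradient of ||theta||^2
    theta = rmsprop_step(theta, g, state)
```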
In this paper, we investigate the problem of computing Bayesian estimators using Langevin Monte-Carlo type approximation. The novelty of this paper is to consider together the statistical and numerical counterparts (in a general log-concave setting). … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2010.06420
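A minimal sketch of the kind of Langevin Monte-Carlo approximation this entry refers to: the Unadjusted Langevin Algorithm with iterate averaging as an estimate of the posterior mean, run on a hypothetical log-concave (Gaussian) toy posterior. Step size, burn-in and iteration count are arbitrary, not the tuned choices studied in the paper.

```python
import numpy as np

def ula_posterior_mean(grad_log_post, theta0, step=1e-3, n_iter=20_000,
                       burn_in=2_000, rng=None):
    """Unadjusted Langevin Algorithm:
        theta_{k+1} = theta_k + step * grad log pi(theta_k) + sqrt(2 * step) * N(0, I),
    with the Bayesian estimator (posterior mean) approximated by averaging the
    iterates after a burn-in period."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    acc = np.zeros_like(theta)
    for k in range(n_iter):
        theta = theta + step * grad_log_post(theta) \
                + np.sqrt(2 * step) * rng.normal(size=theta.shape)
        if k >= burn_in:
            acc += theta
    return acc / (n_iter - burn_in)

# usage: log-concave toy posterior N(mu, I) in dimension 3 (hypothetical)
mu = np.array([1.0, -2.0, 0.5])
print(ula_posterior_mean(lambda th: -(th - mu), theta0=np.zeros(3)))   # close to mu
```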
Author:
Costa, Manon, Gadat, Sébastien
In this work, we study a new recursive stochastic algorithm for the joint estimation of quantile and superquantile of an unknown distribution. The novelty of this algorithm is to use the Cesaro averaging of the quantile estimation inside the recursive … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/2009.13174
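A simplified sketch in the spirit of this entry, assuming i.i.d. draws from the unknown distribution: a Robbins-Monro recursion for the alpha-quantile, its Cesaro (running) average, and an averaged Rockafellar-Uryasev recursion for the superquantile built from that average. The step-size exponent and the Gaussian toy example are illustrative, not the paper's exact scheme.

```python
import numpy as np

def quantile_superquantile(sample_stream, alpha=0.95, n_iter=100_000, rng=None):
    """Joint recursive estimation of the alpha-quantile and alpha-superquantile (CV@R).
    `sample_stream(rng)` returns one i.i.d. draw of the unknown distribution."""
    rng = np.random.default_rng() if rng is None else rng
    theta = 0.0      # current quantile iterate
    theta_bar = 0.0  # Cesaro average of the quantile iterates
    psi = 0.0        # superquantile estimate
    for n in range(1, n_iter + 1):
        x = sample_stream(rng)
        gamma = n ** -0.75                          # quantile step size
        theta -= gamma * ((x <= theta) - alpha)     # Robbins-Monro quantile recursion
        theta_bar += (theta - theta_bar) / n        # Cesaro averaging
        target = theta_bar + max(x - theta_bar, 0.0) / (1.0 - alpha)
        psi += (target - psi) / n                   # averaged Rockafellar-Uryasev recursion
    return theta_bar, psi

# usage: standard normal, true values are roughly 1.645 (quantile) and 2.06 (superquantile)
print(quantile_superquantile(lambda rng: rng.normal()))
```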
This paper is devoted to two different two-time-scale stochastic approximation algorithms for superquantile estimation. We shall investigate the asymptotic behavior of a Robbins-Monro estimator and its convexified version. Our main contribution is to … (the two coupled recursions are displayed below)
External link:
http://arxiv.org/abs/2007.14659
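For reference, the two coupled recursions behind this entry and the previous one can be written as a generic two-time-scale scheme in Rockafellar-Uryasev form; the step-size exponents shown are one standard illustrative choice, not necessarily the ones analysed in the paper.

```latex
% Two-time-scale Robbins-Monro recursions for the alpha-quantile (theta) and the
% alpha-superquantile (psi); illustrative step sizes gamma_n = n^{-a}, 1/2 < a < 1,
% and beta_n = n^{-1} separate the two time scales.
\begin{aligned}
\theta_{n+1} &= \theta_n - \gamma_{n+1}\bigl(\mathbf{1}_{\{X_{n+1} \le \theta_n\}} - \alpha\bigr),\\
\psi_{n+1}   &= \psi_n - \beta_{n+1}\Bigl(\psi_n - \theta_n - \tfrac{(X_{n+1} - \theta_n)^{+}}{1 - \alpha}\Bigr).
\end{aligned}
```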
This paper investigates the statistical estimation of a discrete mixing measure $\mu_0$ involved in a kernel mixture model. Using some recent advances in l1-regularization over the space of measures, we introduce a "data fitting and regularization" … (an illustrative sketch follows below)
External link:
http://arxiv.org/abs/1907.10592
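A crude discretised stand-in for the programme described in this entry: the mixing measure is restricted to a fixed grid of Gaussian kernel locations and its weights are obtained from an l1-penalised least-squares fit to a histogram of the data, solved by non-negative ISTA. The grid, bandwidth, penalty level and the two-component toy mixture are all hypothetical choices; the paper works over the space of measures rather than over a grid.

```python
import numpy as np

def sparse_mixture_weights(X_obs, grid, bandwidth=0.3, lam=0.01, n_bins=60, n_iter=5_000):
    """Weights of a discrete mixing measure supported on `grid`, obtained by minimising
        0.5 * ||Phi w - hist||^2 + lam * ||w||_1   subject to   w >= 0,
    where Phi[i, j] is a Gaussian kernel centred at grid[j] evaluated at the i-th
    histogram bin centre.  Solved with projected ISTA (proximal gradient)."""
    edges = np.linspace(X_obs.min() - 1.0, X_obs.max() + 1.0, n_bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    hist, _ = np.histogram(X_obs, bins=edges, density=True)        # crude density target
    Phi = np.exp(-0.5 * ((centers[:, None] - grid[None, :]) / bandwidth) ** 2)
    Phi /= bandwidth * np.sqrt(2.0 * np.pi)                        # kernel dictionary
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2                       # 1 / Lipschitz constant
    w = np.zeros(len(grid))
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ w - hist)
        w = np.maximum(w - step * (grad + lam), 0.0)               # prox step for lam*||w||_1, w >= 0
    return w

# usage: two-component Gaussian mixture, candidate locations on a grid
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-2.0, 0.3, 400), rng.normal(1.0, 0.3, 600)])
grid = np.linspace(-4.0, 3.0, 71)
w = sparse_mixture_weights(X, grid)
print(grid[w > 0.05])   # the recovered support should cluster near -2 and 1
```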