Showing 1 - 10 of 165 for search: '"Gadat Sébastien"'
Author:
Lalanne, Clément, Gadat, Sébastien
Published in:
ICML 2024 - 41st International Conference on Machine Learning, Jul 2024, Vienna, Austria. 39 p.
Fueled by the ever-increasing need for statistics that guarantee the privacy of their training sets, this article studies the centrally private estimation of Sobolev-smooth probability densities over the hypercube in dimension d. The contributions …
External link:
http://arxiv.org/abs/2409.10083
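The record above is truncated, but the topic (centrally private density estimation on the hypercube) can be illustrated with a simple baseline. The sketch below is not the paper's estimator: it is a Laplace-mechanism histogram density on [0, 1]^d, with an illustrative bin count and privacy budget `epsilon`.

```python
# Minimal sketch (not the paper's estimator): a centrally private histogram
# density on the unit hypercube [0, 1]^d, privatised with the Laplace mechanism.
import numpy as np

def private_histogram_density(samples, bins_per_dim=8, epsilon=1.0, rng=None):
    """samples: (n, d) array with entries in [0, 1]; returns a DP histogram density."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = samples.shape
    edges = [np.linspace(0.0, 1.0, bins_per_dim + 1)] * d
    counts, _ = np.histogramdd(samples, bins=edges)
    # Replacing one individual moves at most one unit between two bins,
    # so the L1 sensitivity of the count vector is 2: add Laplace(2 / epsilon) noise.
    noisy = np.clip(counts + rng.laplace(scale=2.0 / epsilon, size=counts.shape), 0.0, None)
    bin_volume = (1.0 / bins_per_dim) ** d
    total = noisy.sum()
    # Renormalise so the piecewise-constant density integrates to 1.
    return noisy / (total * bin_volume) if total > 0 else np.full_like(noisy, 1.0)
```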
Published in:
ESAIM: Proceedings and Surveys, Vol. 75, pp. 1-1 (2023)
External link:
https://doaj.org/article/d9cc93cef2794d48867ed4c28cf30e1b
This article studies and solves the problem of optimal portfolio allocation with a CV@R penalty when dealing with imperfectly simulated financial assets. We use a Stochastic biased Mirror Descent to find the optimal resource allocation for a portfolio whose …
External link:
http://arxiv.org/abs/2402.11999
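As a rough illustration of the setting (not the paper's algorithm), the sketch below runs stochastic mirror descent with an entropic mirror map (exponentiated gradient) on a CV@R-penalised objective, using the Rockafellar-Uryasev representation of CVaR. The simulator `simulate_returns`, the penalty weight `lam`, and the step sizes are illustrative choices.

```python
import numpy as np

def cvar_mirror_descent(simulate_returns, d, n_iter=5000, alpha=0.95,
                        lam=1.0, step=0.05, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    w = np.full(d, 1.0 / d)     # portfolio weights on the simplex
    t = 0.0                     # Rockafellar-Uryasev auxiliary variable
    for k in range(1, n_iter + 1):
        r = simulate_returns(rng)            # one (possibly biased) sample of asset returns
        loss = -w @ r                        # portfolio loss for this sample
        tail = 1.0 if loss > t else 0.0      # indicator of the (1 - alpha) tail
        grad_w = -r + lam * (-r) * tail / (1.0 - alpha)
        grad_t = lam * (1.0 - tail / (1.0 - alpha))
        eta = step / np.sqrt(k)
        w = w * np.exp(-eta * grad_w)        # entropic mirror (multiplicative) step
        w /= w.sum()                         # stays on the simplex by construction
        t -= eta * grad_t
    return w, t

# Toy usage with a Gaussian return simulator:
# w, t = cvar_mirror_descent(lambda rng: rng.normal(0.05, 0.2, size=5), d=5)
```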
This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD), specifically tailored for solving sparse optimisation …
External link:
http://arxiv.org/abs/2312.05993
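The random-features ingredient mentioned above can be sketched independently of CPGD. The snippet below is only the standard Rahimi-Recht random Fourier feature map for a Gaussian kernel, under illustrative parameter names; it is not the paper's algorithm.

```python
import numpy as np

def random_fourier_features(X, n_features=256, lengthscale=1.0, rng=None):
    """Map X (n, d) to features whose inner products approximate a Gaussian kernel."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))   # frequencies
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)              # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# phi = random_fourier_features(X); phi @ phi.T approximates
# exp(-||x - y||^2 / (2 * lengthscale**2)) at a cost linear in n_features.
```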
Published in:
ESAIM: Proceedings and Surveys, Vol. 51, pp. 293-319 (2015)
This paper proposes to review some recent developments in Bayesian statistics for high-dimensional data. After giving some brief motivations in a short introduction, we describe new advances in the understanding of Bayes posterior computation, as we …
External link:
https://doaj.org/article/c720e91071314caca5a606734e839baa
In this paper, we investigate a continuous-time version of the Stochastic Langevin Monte Carlo method, introduced in [WT11], that incorporates a stochastic sampling step inside the traditional over-damped Langevin diffusion. This method is popular in …
External link:
http://arxiv.org/abs/2301.03077
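For context, the discrete-time scheme of [WT11] that the paper lifts to continuous time is Stochastic Gradient Langevin Dynamics. A minimal sketch, with illustrative step size and mini-batch handling:

```python
import numpy as np

def sgld(grad_log_post_minibatch, theta0, n_data, batch_size, step, n_iter, rng=None):
    """Stochastic Gradient Langevin Dynamics: noisy gradient step + Gaussian injection."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    samples = []
    for _ in range(n_iter):
        idx = rng.choice(n_data, size=batch_size, replace=False)
        g = grad_log_post_minibatch(theta, idx)      # unbiased estimate of grad log pi(theta)
        theta = theta + 0.5 * step * g + np.sqrt(step) * rng.normal(size=theta.shape)
        samples.append(theta.copy())
    return np.array(samples)
```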
We introduce a new second-order stochastic algorithm to estimate the entropically regularized optimal transport cost between two probability measures. The source measure can be arbitrarily chosen, either absolutely continuous or discrete, while the target …
External link:
http://arxiv.org/abs/2107.05291
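A first-order baseline for this setting (not the paper's second-order method) is averaged SGD on the semi-dual of entropic OT, for an arbitrary source sampler and a discrete target measure sum_j nu_j * delta_{y_j}. Names, the squared-Euclidean cost, and the step schedule below are illustrative.

```python
import numpy as np

def semidual_sgd_entropic_ot(sample_source, Y, nu, eps=0.1, n_iter=20000,
                             step=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    m = len(nu)
    v = np.zeros(m)            # dual potential on the target support
    v_avg = np.zeros(m)        # Polyak-Ruppert average of the iterates
    for k in range(1, n_iter + 1):
        x = sample_source(rng)
        cost = np.sum((Y - x) ** 2, axis=1)        # c(x, y_j), squared Euclidean
        logits = (v - cost) / eps + np.log(nu)
        p = np.exp(logits - logits.max())
        p /= p.sum()                               # Gibbs weights on the target atoms
        grad = nu - p                              # stochastic gradient of the semi-dual
        v += (step / np.sqrt(k)) * grad            # ascent step
        v_avg += (v - v_avg) / k
    return v_avg
```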
Author:
Gadat, Sébastien, Gavra, Ioana
This paper studies some asymptotic properties of adaptive algorithms widely used in optimization and machine learning, among them Adagrad and RMSProp, which are involved in most black-box deep learning algorithms. Our setup is the non-convex …
External link:
http://arxiv.org/abs/2012.05640
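For reference, the two update rules named in the abstract, written out as plain NumPy steps; the hyper-parameter values are common defaults, not the paper's choices.

```python
import numpy as np

def adagrad_step(theta, grad, state, lr=0.01, eps=1e-8):
    # Accumulate squared gradients and scale the step coordinate-wise.
    state["sum_sq"] = state.get("sum_sq", np.zeros_like(theta)) + grad ** 2
    return theta - lr * grad / (np.sqrt(state["sum_sq"]) + eps)

def rmsprop_step(theta, grad, state, lr=0.001, rho=0.9, eps=1e-8):
    # Exponential moving average of squared gradients instead of a full sum.
    state["avg_sq"] = rho * state.get("avg_sq", np.zeros_like(theta)) + (1.0 - rho) * grad ** 2
    return theta - lr * grad / (np.sqrt(state["avg_sq"]) + eps)
```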
In this paper, we investigate the problem of computing Bayesian estimators using Langevin Monte Carlo-type approximations. The novelty of this paper is to consider the statistical and numerical counterparts together (in a general log-concave setting).
External link:
http://arxiv.org/abs/2010.06420
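A minimal sketch of the kind of approximation involved: the unadjusted Langevin algorithm with an online average as the estimator of a posterior mean. Step size and burn-in below are illustrative, not the paper's tuning.

```python
import numpy as np

def ula_posterior_mean(grad_log_post, theta0, step=1e-3, n_iter=50000,
                       burn_in=10000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float).copy()
    running_mean, count = np.zeros_like(theta), 0
    for k in range(n_iter):
        # Unadjusted Langevin step: drift along grad log pi plus Gaussian noise.
        theta = theta + step * grad_log_post(theta) \
                + np.sqrt(2.0 * step) * rng.normal(size=theta.shape)
        if k >= burn_in:
            count += 1
            running_mean += (theta - running_mean) / count   # online posterior-mean estimate
    return running_mean
```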
Author:
Costa, Manon, Gadat, Sébastien
In this work, we study a new recursive stochastic algorithm for the joint estimation of the quantile and superquantile of an unknown distribution. The novelty of this algorithm is to use Cesàro averaging of the quantile estimate inside the recursive …
External link:
http://arxiv.org/abs/2009.13174
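A simplified sketch in the spirit of the description above: a Robbins-Monro quantile recursion, its Cesàro average, and a superquantile (CVaR) recursion that plugs the averaged quantile in. The step sizes are illustrative and not the paper's choices.

```python
import numpy as np

def joint_quantile_superquantile(sample, alpha=0.95, n_iter=100000, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    q, q_bar, s = 0.0, 0.0, 0.0
    for n in range(1, n_iter + 1):
        x = sample(rng)
        gamma = 1.0 / n ** 0.75                         # Robbins-Monro step size
        q -= gamma * ((x <= q) - alpha)                 # quantile recursion
        q_bar += (q - q_bar) / n                        # Cesaro average of the iterates
        target = q_bar + max(x - q_bar, 0.0) / (1.0 - alpha)
        s += (target - s) / n                           # superquantile recursion
    return q_bar, s

# For a standard Gaussian at alpha = 0.95, q_bar should approach about 1.645
# and s about 2.06:
# q_bar, s = joint_quantile_superquantile(lambda rng: rng.normal())
```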