Showing 1 - 10 of 148 for search: '"Rudi, Alessandro"'
A recent stream of structured learning approaches has improved the practical state of the art for a range of combinatorial optimization problems with complex objectives encountered in operations research. Such approaches train policies that chain a s…
External link:
http://arxiv.org/abs/2407.17200
We study a theoretical and algorithmic framework for structured prediction in the online learning setting. The problem of structured prediction, i.e. estimating a function whose output space lacks a vectorial structure, is well studied in the literature…
External link:
http://arxiv.org/abs/2406.12366
Sequential Bayesian Filtering aims to estimate the current state distribution of a Hidden Markov Model, given the past observations. The problem is well known to be intractable for most application domains, except in notable cases such as the tabular…
External link:
http://arxiv.org/abs/2402.09796
We present a novel approach to non-convex optimization with certificates, which handles smooth functions on the hypercube or on the torus. Unlike traditional methods that rely on algebraic properties, our algorithm exploits the regularity of the target…
External link:
http://arxiv.org/abs/2306.14932
Author:
Bonalli, Riccardo, Rudi, Alessandro
We propose a novel non-parametric learning paradigm for the identification of drift and diffusion coefficients of multi-dimensional non-linear stochastic differential equations, which relies upon discrete-time observations of the state. The key idea…
External link:
http://arxiv.org/abs/2305.15557
This paper deals with the problem of efficient sampling from a stochastic differential equation, given the drift function and the diffusion matrix. The proposed approach leverages a recent model for probabilities \cite{rudi2021psd} (the positive semi-definite…
External link:
http://arxiv.org/abs/2303.17109
Handling an infinite number of inequality constraints in infinite-dimensional spaces occurs in many fields, from global optimization to optimal transport. These problems have been tackled individually in several previous articles through kernel Sum-of-Squares…
External link:
http://arxiv.org/abs/2301.06339
We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional output. We derive learning bounds for our method, and study under which setting statistical performance is improved in comparison to…
External link:
http://arxiv.org/abs/2211.08958
Author:
Bach, Francis, Rudi, Alessandro
Published in:
SIAM Journal on Optimization, in press
We consider the unconstrained optimization of multivariate trigonometric polynomials by the sum-of-squares hierarchy of lower bounds. We first show a convergence rate of $O(1/s^2)$ for the relaxation with degree $s$ without any assumption on the trigonometric…
External link:
http://arxiv.org/abs/2211.04889
The workhorse of machine learning is stochastic gradient descent. To access stochastic gradients, it is common to iterate over input/output pairs of a training dataset. Interestingly, it appears that one does not need full supervision to access…
External link:
http://arxiv.org/abs/2205.13255
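The stochastic gradient descent workflow mentioned in this last abstract, drawing one input/output pair per update step, can be sketched as follows. This is a minimal illustration on synthetic least-squares data; the dataset, step size, and model are hypothetical and not taken from the paper:

```python
import numpy as np

# Synthetic least-squares problem: y = X @ w_true + small noise.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = np.arange(1.0, d + 1.0)
y = X @ w_true + 0.01 * rng.normal(size=n)

# Plain SGD: each step uses the gradient of one sample's squared loss,
# 0.5 * (x_i @ w - y_i)**2, whose gradient is (x_i @ w - y_i) * x_i.
w = np.zeros(d)
lr = 0.05
for epoch in range(50):
    for i in rng.permutation(n):  # one input/output pair per step
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad

print(np.allclose(w, w_true, atol=0.1))  # w approaches w_true
```

With a small constant step size and low-noise data, the iterate settles in a neighbourhood of the least-squares solution; in practice a decaying step size or averaging is used to converge more tightly.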