Showing 1 - 10 of 112 for search: '"Deligiannidis, George"'
Score-matching generative models have proven successful at sampling from complex high-dimensional data distributions. In many applications, this distribution is believed to concentrate on a much lower $d$-dimensional manifold embedded into $D$-dimensional space …
External link:
http://arxiv.org/abs/2410.09046
Denoising Diffusion Probabilistic Models (DDPMs) are powerful state-of-the-art methods used to generate synthetic data from high-dimensional data distributions and are widely used for image, audio and video generation as well as many more applications …
External link:
http://arxiv.org/abs/2409.18804
Within the field of optimal transport (OT), the choice of ground cost is crucial to ensuring that the optimality of a transport map corresponds to usefulness in real-world applications. It is therefore desirable to use known information to tailor costs …
External link:
http://arxiv.org/abs/2406.08399
We propose data-dependent uniform generalization bounds by approaching the problem from a PAC-Bayesian perspective. We first apply the PAC-Bayesian framework on `random sets' in a rigorous way, where the training algorithm is assumed to output a data-dependent hypothesis set …
External link:
http://arxiv.org/abs/2404.17442
Author:
Phillips, Angus, Dau, Hai-Dang, Hutchinson, Michael John, De Bortoli, Valentin, Deligiannidis, George, Doucet, Arnaud
Denoising diffusion models have become ubiquitous for generative modeling. The core idea is to transport the data distribution to a Gaussian by using a diffusion. Approximate samples from the data distribution are then obtained by estimating the time-reversal of this diffusion …
External link:
http://arxiv.org/abs/2402.06320
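The transport-and-reverse idea summarized in this abstract can be illustrated with a short toy sketch (my own illustration under simplifying assumptions, not the paper's method): an Ornstein-Uhlenbeck forward process noises the data towards a Gaussian, and approximate samples are obtained by Euler-Maruyama steps on the time-reversed SDE with a plug-in score. The closed-form mixture_score below stands in for a learned score network.

# Toy sketch (not the paper's method): forward Ornstein-Uhlenbeck noising
# towards a Gaussian, then Euler-Maruyama steps on the time-reversed SDE
# using a plug-in score on a 1-D Gaussian-mixture toy dataset.
import numpy as np

rng = np.random.default_rng(0)

def mixture_score(x, t):
    # Exact score of the noised marginal when the data are an equal mixture
    # of N(-2, 0.1) and N(2, 0.1) pushed through dX = -X dt + sqrt(2) dW.
    means = np.array([-2.0, 2.0]) * np.exp(-t)
    var = 0.1 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    logw = -(x[:, None] - means) ** 2 / (2.0 * var)
    w = np.exp(logw - logw.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return ((means - x[:, None]) / var * w).sum(axis=1)

T, n_steps, n = 3.0, 500, 5000
dt = T / n_steps
x = rng.standard_normal(n)                 # start from the Gaussian reference
for k in range(n_steps):                   # integrate the reverse SDE, t: T -> 0
    t = T - k * dt
    drift = x + 2.0 * mixture_score(x, t)  # reverse drift: -f + g^2 * score
    x = x + drift * dt + np.sqrt(2.0 * dt) * rng.standard_normal(n)

print(x.mean(), x.std())                   # should resemble the bimodal target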
Denoising diffusions are a powerful method to generate approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming $L^2$-accurate scores. Until now, the tightest bounds were …
External link:
http://arxiv.org/abs/2308.03686
While conformal predictors reap the benefits of rigorous statistical guarantees on their error frequency, the size of their corresponding prediction sets is critical to their practical utility. Unfortunately, there is currently a lack of finite-sample …
External link:
http://arxiv.org/abs/2306.07254
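For context on why prediction-set size matters, here is a generic split-conformal regression sketch (the standard recipe on toy data, not this paper's contribution): the coverage level is guaranteed by construction, while the interval width is entirely determined by the calibration quantile q, which is the quantity a finite-sample size analysis would need to control.

# Generic split-conformal regression sketch on toy data.
import numpy as np

rng = np.random.default_rng(1)
n_train, n_cal, alpha = 500, 500, 0.1

x = rng.uniform(-3.0, 3.0, n_train + n_cal)
y = np.sin(x) + 0.2 * rng.standard_normal(x.size)
x_tr, x_cal = x[:n_train], x[n_train:]
y_tr, y_cal = y[:n_train], y[n_train:]

coef = np.polyfit(x_tr, y_tr, deg=5)        # any point predictor works here
predict = lambda z: np.polyval(coef, z)

scores = np.abs(y_cal - predict(x_cal))     # conformity scores on the calibration split
k = int(np.ceil((n_cal + 1) * (1.0 - alpha)))
q = np.sort(scores)[min(k, n_cal) - 1]      # finite-sample (1 - alpha) quantile

x_new = 1.0                                 # prediction set: an interval of width 2q
lo, hi = predict(x_new) - q, predict(x_new) + q
print(lo, hi)                               # contains y_new with probability >= 1 - alpha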
Author:
Williams, Christopher, Falck, Fabian, Deligiannidis, George, Holmes, Chris, Doucet, Arnaud, Syed, Saifuddin
U-Nets are a go-to, state-of-the-art neural architecture across numerous tasks for continuous signals on a square, such as images and Partial Differential Equations (PDEs); however, their design and architecture are understudied. In this paper, we provide …
External link:
http://arxiv.org/abs/2305.19638
Score-based generative models are a popular class of generative modelling techniques relying on stochastic differential equations (SDEs). From their inception, it was realized that it was also possible to perform generation using ordinary differential equations …
External link:
http://arxiv.org/abs/2305.16860
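A minimal illustration of the SDE-versus-ODE point (a toy sketch under my own assumptions, not the paper's analysis): with an Ornstein-Uhlenbeck forward process and Gaussian data the score of the noised marginal is available in closed form, and deterministic Euler steps on the probability-flow ODE transport reference Gaussian samples back to the data distribution without any injected noise. A trained score network would replace exact_score in practice.

# Probability-flow ODE sketch: deterministic generation using only the score.
import numpy as np

def exact_score(x, t):
    # Score of the noised marginal for data N(2, 0.25) under dX = -X dt + sqrt(2) dW.
    m_t = 2.0 * np.exp(-t)
    v_t = 0.25 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return (m_t - x) / v_t

rng = np.random.default_rng(2)
T, n_steps = 3.0, 500
dt = T / n_steps
x = rng.standard_normal(10_000)             # reference Gaussian samples
for k in range(n_steps):                    # deterministic Euler steps, t: T -> 0
    t = T - k * dt
    x = x + dt * (x + exact_score(x, t))    # reversed ODE drift: -f + (g^2 / 2) * score

print(x.mean(), x.std())                    # should approach 2.0 and 0.5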
Published in:
International Conference on Machine Learning (ICML 2023)
Providing generalization guarantees for modern neural networks has been a crucial task in statistical learning. Recently, several studies have attempted to analyze the generalization error in such settings by using tools from fractal geometry. While …
External link:
http://arxiv.org/abs/2302.02766