Showing 1 - 10
of 139
for search: '"Sriperumbudur, Bharath"'
Author:
Chen, Zonghao, Mustafi, Aratrika, Glaser, Pierre, Korba, Anna, Gretton, Arthur, Sriperumbudur, Bharath K.
We introduce a (de)-regularization of the Maximum Mean Discrepancy (DrMMD) and its Wasserstein gradient flow. Existing gradient flows that transport samples from a source distribution to a target distribution with only target samples either lack tractab…
External link:
http://arxiv.org/abs/2409.14980
Author:
Zhang, Zhengxin, Goldfeld, Ziv, Greenewald, Kristjan, Mroueh, Youssef, Sriperumbudur, Bharath K.
The Wasserstein space of probability measures is known for its intricate Riemannian structure, which underpins the Wasserstein geometry and enables gradient flow algorithms. However, the Wasserstein geometry may not be suitable for certain tasks or d…
External link:
http://arxiv.org/abs/2407.11800
Functional linear regression is one of the fundamental and well-studied methods in functional data analysis. In this work, we investigate the functional linear regression model within the context of a reproducing kernel Hilbert space by employing gener…
External link:
http://arxiv.org/abs/2406.10005
Kernel methods underpin many of the most successful approaches in data science and statistics, and they allow representing probability measures as elements of a reproducing kernel Hilbert space without loss of information. Recently, the kernel Stein…
External link:
http://arxiv.org/abs/2406.08401
We explore the minimax optimality of goodness-of-fit tests on general domains using the kernelized Stein discrepancy (KSD). The KSD framework offers a flexible approach for goodness-of-fit testing, avoiding strong distributional assumptions, accommod…
External link:
http://arxiv.org/abs/2404.08278
In this paper, we discuss the convergence analysis of the conjugate gradient-based algorithm for the functional linear model in the reproducing kernel Hilbert space framework, utilizing early stopping results in regularization against over-fitting. W…
External link:
http://arxiv.org/abs/2310.02607
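The snippet above concerns conjugate gradient with early stopping as regularization. As a rough illustration (not the paper's functional-data algorithm), the sketch below runs conjugate gradient on the normal equations of an ordinary least-squares problem, where the number of iterations plays the role of the regularization parameter; the function name and setup are my own.

```python
import numpy as np

def cg_early_stop(A, b, n_iter):
    """Conjugate gradient on the normal equations A^T A x = A^T b,
    truncated after n_iter steps. Stopping early regularizes the
    solution; running to convergence recovers the least-squares fit."""
    x = np.zeros(A.shape[1])
    r = A.T @ b          # residual of the normal equations at x = 0
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        if np.sqrt(rs) < 1e-12:   # already converged
            break
        Ap = A.T @ (A @ p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x
```

On a well-posed problem, letting the iteration run to the dimension of the unknown recovers the exact solution; on ill-posed problems one would instead stop after a few steps, chosen for example by a discrepancy principle.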
Maximum mean discrepancy (MMD) has enjoyed a lot of success in many machine learning and statistical applications, including non-parametric hypothesis testing, because of its ability to handle non-Euclidean data. Recently, it has been demonstrated in…
External link:
http://arxiv.org/abs/2308.04561
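For readers unfamiliar with MMD, here is a minimal sketch of the standard unbiased estimator of squared MMD with a Gaussian kernel, computed from two samples; the bandwidth choice is an arbitrary assumption, and this is generic background rather than the method of the paper above.

```python
import numpy as np

def mmd2_unbiased(X, Y, bandwidth=1.0):
    """Unbiased estimator of squared MMD between samples X and Y
    (rows are observations), using a Gaussian kernel."""
    def k(A, B):
        # pairwise squared distances, then Gaussian kernel
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-d2 / (2 * bandwidth**2))
    m, n = len(X), len(Y)
    Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
    # off-diagonal averages give the unbiased within-sample terms
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2 * Kxy.mean()
```

The estimate is close to zero when both samples come from the same distribution and strictly positive (for a characteristic kernel) when they do not, which is what makes MMD usable as a non-parametric two-sample test statistic.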
We consider a kernelized version of the $\epsilon$-greedy strategy for contextual bandits. More precisely, in a setting with finitely many arms, we consider that the mean reward functions lie in a reproducing kernel Hilbert space (RKHS). We propose a…
External link:
http://arxiv.org/abs/2306.17329
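The setting above pairs $\epsilon$-greedy exploration with RKHS-valued mean reward functions. As a rough sketch of that combination (all class and parameter names are my own, and per-arm kernel ridge regression is an assumed simplification, not necessarily the paper's estimator):

```python
import numpy as np

class KernelEpsGreedy:
    """Epsilon-greedy contextual bandit where each arm's mean reward
    is estimated by kernel ridge regression with a Gaussian kernel."""
    def __init__(self, n_arms, eps=0.1, lam=1.0, bandwidth=1.0, seed=0):
        self.n_arms, self.eps, self.lam, self.bw = n_arms, eps, lam, bandwidth
        self.rng = np.random.default_rng(seed)
        self.X = [[] for _ in range(n_arms)]   # contexts observed per arm
        self.r = [[] for _ in range(n_arms)]   # rewards observed per arm

    def _k(self, A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-d2 / (2 * self.bw**2))

    def predict(self, arm, x):
        """Kernel ridge estimate of the arm's mean reward at context x."""
        if not self.X[arm]:
            return 0.0
        X, r = np.array(self.X[arm]), np.array(self.r[arm])
        alpha = np.linalg.solve(self._k(X, X) + self.lam * np.eye(len(X)), r)
        return float(self._k(x[None, :], X) @ alpha)

    def select(self, x):
        """With probability eps explore uniformly; otherwise play greedily."""
        if self.rng.random() < self.eps:
            return int(self.rng.integers(self.n_arms))
        return int(np.argmax([self.predict(a, x) for a in range(self.n_arms)]))

    def update(self, arm, x, reward):
        self.X[arm].append(x)
        self.r[arm].append(reward)
```

A typical loop would call `select` on each incoming context, observe the reward for the chosen arm, and feed it back through `update`; decaying `eps` over time trades exploration for exploitation.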
The Gromov-Wasserstein (GW) distance, rooted in optimal transport (OT) theory, quantifies dissimilarity between metric measure spaces and provides a framework for aligning heterogeneous datasets. While computational aspects of the GW problem have bee…
External link:
http://arxiv.org/abs/2212.12848
Over the last decade, an approach that has gained a lot of popularity for tackling nonparametric testing problems on general (i.e., non-Euclidean) domains is based on the notion of reproducing kernel Hilbert space (RKHS) embedding of probability distrib…
External link:
http://arxiv.org/abs/2212.09201