Showing 1 - 10 of 617
for search query: '"Suykens, Johan A. K."'
Author:
Tao, Qinghua, Tonin, Francesco, Lambert, Alex, Chen, Yingyi, Patrinos, Panagiotis, Suykens, Johan A. K.
Published in:
The 41st International Conference on Machine Learning (ICML), 2024
In contrast with Mercer kernel-based approaches, as used e.g. in Kernel Principal Component Analysis (KPCA), it was previously shown that Singular Value Decomposition (SVD) inherently relates to asymmetric kernels and Asymmetric Kernel Singular Value…
External link:
http://arxiv.org/abs/2406.08748
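As a toy illustration of the contrast this abstract draws (a sketch with hypothetical data, not the paper's method): a symmetric Mercer kernel matrix admits an eigendecomposition as in KPCA, whereas an asymmetric kernel matrix between two different point sets is rectangular and is naturally handled by an SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))   # source points
Z = rng.standard_normal((15, 3))   # target points

# Symmetric (Mercer) RBF kernel on X: eigendecomposition, as in KPCA.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_sym = np.exp(-0.5 * sq)
eigvals, eigvecs = np.linalg.eigh(K_sym)   # real spectrum, K_sym is PSD

# Asymmetric kernel between two different sets: rectangular, so use SVD.
sq2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
K_asym = np.exp(-0.5 * sq2)                # shape (20, 15), no symmetry
U, s, Vt = np.linalg.svd(K_asym, full_matrices=False)

# Left/right singular vectors give two coupled sets of components.
print(K_sym.shape, K_asym.shape, s.shape)
```

The left and right singular vectors play the role that a single set of eigenvectors plays in the symmetric case.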
Ridgeless regression has garnered attention among researchers, particularly in light of the "Benign Overfitting" phenomenon, where models interpolating noisy samples demonstrate robust generalization. However, kernel ridgeless regression does not a…
External link:
http://arxiv.org/abs/2406.01435
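A minimal numpy sketch of the interpolation ("ridgeless") setting the abstract refers to, on hypothetical toy data — not the paper's analysis. With zero regularization the kernel fit passes through every noisy training sample:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = np.linspace(-1.0, 1.0, n)[:, None]
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(n)  # noisy targets

# RBF kernel matrix; "ridgeless" means no regularization term (lambda = 0),
# so the resulting fit interpolates the training data exactly.
sq = (X - X.T) ** 2
K = np.exp(-sq / (2 * 0.05 ** 2))   # narrow bandwidth keeps K well-conditioned
alpha = np.linalg.solve(K, y)

y_hat = K @ alpha
print(float(np.max(np.abs(y_hat - y))))   # max residual is ~0
```

Whether such an interpolant still generalizes is exactly the "benign overfitting" question.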
Clustering nodes in heterophilous graphs presents unique challenges due to the asymmetric relationships often overlooked by traditional methods, which moreover assume that good clustering corresponds to high intra-cluster and low inter-cluster connec…
External link:
http://arxiv.org/abs/2405.17050
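A toy sketch of the asymmetry point (illustrative only, not the paper's algorithm): symmetrizing a directed adjacency matrix, as many clustering pipelines implicitly do, discards edge direction, whereas an SVD of the raw matrix keeps separate "sender" and "receiver" roles for each node.

```python
import numpy as np

# A toy directed (asymmetric) graph: edges i -> j are not mirrored.
A = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
    [1, 0, 0, 0],
], dtype=float)

# Symmetrizing throws away direction information.
A_sym = (A + A.T) / 2
# An SVD of the raw adjacency keeps distinct row/column (out/in) embeddings.
U, s, Vt = np.linalg.svd(A)
print(np.allclose(A, A.T), np.allclose(A_sym, A_sym.T))
```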
For the linear inverse problem with sparsity constraints, the $l_0$ regularized problem is NP-hard, and existing approaches either utilize greedy algorithms to find almost-optimal solutions or approximate the $l_0$ regularization with its convex c…
External link:
http://arxiv.org/abs/2402.08493
While the great capability of Transformers significantly boosts prediction accuracy, it could also yield overconfident predictions and require calibrated uncertainty estimation, which can be commonly tackled by Gaussian processes (GPs). Existing work…
External link:
http://arxiv.org/abs/2402.01476
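A generic GP-regression sketch of the calibration point (hypothetical toy data, not the paper's Transformer-GP construction): the GP posterior supplies not just a mean but a predictive variance that grows away from the training data.

```python
import numpy as np

rng = np.random.default_rng(7)
X = np.linspace(0.0, 1.0, 10)[:, None]
y = np.sin(6 * X[:, 0]) + 0.05 * rng.standard_normal(10)

def rbf(a, b, ell=0.2):
    return np.exp(-((a - b.T) ** 2) / (2 * ell ** 2))

# GP regression posterior: mean plus a calibrated covariance.
sn2 = 0.05 ** 2                          # observation-noise variance
K = rbf(X, X) + sn2 * np.eye(10)
Xs = np.array([[0.5], [2.0]])            # one in-range, one far-away test point
Ks, Kss = rbf(Xs, X), rbf(Xs, Xs)
mean = Ks @ np.linalg.solve(K, y)
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)

# Predictive std is small near the data and large far from it.
std = np.sqrt(np.diag(cov))
print(std)
```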
Adversarial training is a widely used method to improve the robustness of deep neural networks (DNNs) over adversarial perturbations. However, it is empirically observed that adversarial training on over-parameterized networks often suffers from the…
External link:
http://arxiv.org/abs/2401.13624
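A minimal sketch of the adversarial-perturbation setup behind adversarial training (FGSM on a hypothetical logistic model, not the paper's over-parameterized networks): perturb the input along the sign of the loss gradient, then train on the perturbed input as well.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(6)
w = rng.standard_normal(5)           # hypothetical trained linear classifier
x, y = rng.standard_normal(5), 1.0   # one input with positive label

# FGSM: step of size eps along the sign of the input-gradient of the loss.
eps = 0.3
grad_x = (sigmoid(w @ x) - y) * w    # d/dx of the logistic loss
x_adv = x + eps * np.sign(grad_x)

# Adversarial training would now also minimize the loss at x_adv.
loss = -np.log(sigmoid(w @ x))
loss_adv = -np.log(sigmoid(w @ x_adv))
print(loss_adv >= loss)
```

For a linear model this perturbation provably increases the loss, which is why even this tiny example shows the effect.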
With the rapid development of deep learning in various fields of science and technology, such as speech recognition, image classification, and natural language processing, it has recently also been widely applied in functional data analysis (FDA), with…
External link:
http://arxiv.org/abs/2401.02890
The lack of sufficient flexibility is the key bottleneck of kernel-based learning that relies on manually designed, pre-given, and non-trainable kernels. To enhance kernel flexibility, this paper introduces the concept of Locally-Adaptive-Bandwidths…
External link:
http://arxiv.org/abs/2310.05236
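The general idea of per-point bandwidths can be sketched with a common k-nearest-neighbour heuristic (as used e.g. in adaptive kernel density estimation) — an illustration only, not necessarily the paper's parameterization:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 2))

# Fixed-bandwidth Gaussian kernel: one sigma for every point.
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_fixed = np.exp(-d2 / (2 * 1.0 ** 2))

# Locally adaptive bandwidths: each point gets its own sigma_i, here set
# from the distance to its k-th nearest neighbour.
k = 5
sigmas = np.sort(np.sqrt(d2), axis=1)[:, k]
K_adapt = np.exp(-d2 / (sigmas[:, None] * sigmas[None, :]))

# The adaptive matrix stays symmetric but is no longer shift-invariant.
print(K_fixed.shape, K_adapt.shape)
```

Points in dense regions get narrow bandwidths and points in sparse regions wide ones, which is exactly the flexibility a single global bandwidth lacks.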
Multitask learning (MTL) leverages task-relatedness to enhance performance. With the emergence of multimodal data, tasks can now be referenced by multiple indices. In this paper, we employ high-order tensors, with each mode corresponding to a task in…
External link:
http://arxiv.org/abs/2308.16056
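A toy sketch of multi-index tasks (hypothetical dimensions and names, not the paper's model): task parameters indexed by two modes form a third-order tensor, and a low-rank CP factorization couples the tasks across both modes.

```python
import numpy as np

rng = np.random.default_rng(4)
# Tasks indexed by two modes (say, 4 users x 5 conditions), d features each.
d, m1, m2, r = 10, 4, 5, 2

# CP factorization: W[:, i, j] = sum_r F[:, r] * A[i, r] * B[j, r],
# so all (user, condition) tasks share the r latent feature directions.
F = rng.standard_normal((d, r))
A = rng.standard_normal((m1, r))
B = rng.standard_normal((m2, r))
W = np.einsum('dr,ir,jr->dij', F, A, B)   # full task-parameter tensor

print(W.shape)   # one weight vector per (user, condition) pair
```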
Author:
De Plaen, Henri, Suykens, Johan A. K.
In this paper, we characterize Probabilistic Principal Component Analysis in Hilbert spaces and demonstrate how the optimal solution admits a representation in dual space. This allows us to develop a generative framework for kernel methods. Furthermo…
External link:
http://arxiv.org/abs/2307.10078
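The primal/dual equivalence underlying such dual representations can be sketched numerically (a hypothetical toy example, not the paper's probabilistic derivation): with fewer samples than features, the nonzero spectrum of the covariance matrix coincides with that of the much smaller Gram matrix.

```python
import numpy as np

rng = np.random.default_rng(5)
n, d = 8, 50               # fewer samples than features: dual is cheaper
X = rng.standard_normal((n, d))
Xc = X - X.mean(axis=0)    # center the data

# Primal: eigen-decompose the d x d covariance; dual: the n x n Gram matrix.
C = Xc.T @ Xc / n          # (50, 50)
G = Xc @ Xc.T / n          # (8, 8)
ev_primal = np.sort(np.linalg.eigvalsh(C))[::-1][:n]
ev_dual = np.sort(np.linalg.eigvalsh(G))[::-1]

# The nonzero spectra coincide, so principal components admit a dual
# (sample-side) representation -- the basis of kernelized PCA variants.
print(np.allclose(ev_primal[:n - 1], ev_dual[:n - 1]))
```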