Showing 1 - 10 of 205 for search: '"Christmann, Andreas"'
Author:
Christmann, Andreas, Lei, Yunwen
In this paper some methods to use the empirical bootstrap approach for stochastic gradient descent (SGD) to minimize the empirical risk over a separable Hilbert space are investigated from the viewpoint of algorithmic stability and statistical robustness.
External link:
http://arxiv.org/abs/2409.01074
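The setting of this abstract — SGD minimizing an empirical risk, combined with the empirical bootstrap — can be sketched in a toy form. This is not the paper's algorithm: the one-dimensional least-squares model, the step size, and the number of resamples below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2*x + noise; we estimate a scalar weight w by SGD.
n = 200
x = rng.normal(size=n)
y = 2.0 * x + 0.1 * rng.normal(size=n)

def sgd_least_squares(x, y, steps=2000, lr=0.05, seed=1):
    """Plain SGD on the empirical risk (1/n) * sum_i (w*x_i - y_i)^2."""
    rng = np.random.default_rng(seed)
    w = 0.0
    for _ in range(steps):
        i = rng.integers(len(x))                   # draw one sample uniformly
        w -= lr * 2.0 * (w * x[i] - y[i]) * x[i]   # gradient of (w*x_i - y_i)^2
    return w

# Empirical bootstrap: rerun SGD on resamples drawn with replacement
# and use the spread of the resulting estimates as an uncertainty proxy.
boots = []
for b in range(20):
    idx = rng.integers(n, size=n)   # bootstrap resample indices
    boots.append(sgd_least_squares(x[idx], y[idx], seed=b))
```

Rerunning SGD on bootstrap resamples, as above, yields a crude spread of the estimator; the paper studies when such bootstrapped SGD estimates are algorithmically stable and statistically robust in the general Hilbert-space setting.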
In this paper, we study an online learning algorithm with a robust loss function $\mathcal{L}_{\sigma}$ for regression over a reproducing kernel Hilbert space (RKHS). The loss function $\mathcal{L}_{\sigma}$ involving a scaling parameter $\sigma>0$ …
External link:
http://arxiv.org/abs/2304.10060
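To illustrate the kind of algorithm described above, here is a minimal one-pass online kernel regression sketch with a bounded-gradient robust loss. The Welsch-type loss, the Gaussian kernel, and the constant step size are assumptions for illustration, not the paper's exact choices, and regularization is omitted for brevity.

```python
import numpy as np

def gauss_kernel(x, y, gamma=5.0):
    return np.exp(-gamma * (x - y) ** 2)

def robust_loss_grad(r, sigma=1.0):
    # Derivative of the Welsch-type loss L_sigma(r) = (sigma^2/2)*(1 - exp(-r^2/sigma^2));
    # unlike the squared loss, it downweights large residuals (bounded influence).
    return r * np.exp(-r ** 2 / sigma ** 2)

def online_kernel_regression(xs, ys, eta=0.5, sigma=1.0, gamma=5.0):
    """One pass of online gradient descent in the RKHS: f is kept as a kernel
    expansion f(x) = sum_i c_i * K(x_i, x) over the points seen so far."""
    centers, coefs = [], []
    for x, y in zip(xs, ys):
        pred = sum(c * gauss_kernel(cx, x, gamma) for c, cx in zip(coefs, centers))
        r = pred - y
        centers.append(x)                             # each step adds one new term
        coefs.append(-eta * robust_loss_grad(r, sigma))
    def f(x):
        return sum(c * gauss_kernel(cx, x, gamma) for c, cx in zip(coefs, centers))
    return f

rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, size=300)
ys = np.sin(3 * xs) + 0.05 * rng.normal(size=300)
f = online_kernel_regression(xs, ys)
```

Because the gradient of the robust loss is bounded, a single corrupted observation can move the estimate by at most a fixed amount per step — the intuition behind the robustness results for such scaled losses.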
Author:
Guo, Zheng-Chu; Christmann, Andreas; Shi, Lei (leishi@fudan.edu.cn)
Published in:
Foundations of Computational Mathematics, Vol. 24, Issue 5 (Oct 2024), pp. 1455-1483.
Author:
Köhler, Hannes, Christmann, Andreas
Regularized kernel-based methods such as support vector machines (SVMs) typically depend on the underlying probability measure $\mathrm{P}$ (respectively, an empirical measure $\mathrm{D}_n$ in applications) as well as on the regularization parameter …
External link:
http://arxiv.org/abs/2101.12678
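The dependence on both the empirical measure $\mathrm{D}_n$ (the sample) and the regularization parameter can be seen in the simplest regularized kernel method, kernel ridge regression. This sketch (with illustrative data and kernel, not taken from the paper) shows how a larger regularization parameter shrinks the RKHS norm of the estimator.

```python
import numpy as np

def kernel_ridge_fit(X, y, lam, gamma=1.0):
    """Kernel ridge regression: minimize (1/n)*sum_i (f(x_i)-y_i)^2 + lam*||f||_H^2
    over the Gaussian RKHS; the solution is f = sum_i a_i K(x_i, .), with
    a = (K + n*lam*I)^{-1} y by the representer theorem."""
    n = len(X)
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)   # Gram matrix
    a = np.linalg.solve(K + n * lam * np.eye(n), y)
    return a, K

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 50)
y = np.sin(2 * X) + 0.1 * rng.normal(size=50)

a_small, K = kernel_ridge_fit(X, y, lam=1e-4)   # weak regularization
a_large, _ = kernel_ridge_fit(X, y, lam=10.0)   # strong regularization

# RKHS norm ||f||_H^2 = a^T K a: a larger lambda shrinks the estimator.
norm_small = a_small @ K @ a_small
norm_large = a_large @ K @ a_large
```

Both the coefficient vector and the norm of the fitted function change with the sample and with the regularization parameter, which is exactly the joint dependence the abstract refers to.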
Author:
Gensler, Patrick, Christmann, Andreas
It is shown that many results on the statistical robustness of kernel-based pairwise learning can be derived under essentially no assumptions on the input and output spaces. In particular, neither moment conditions on the conditional distribution of Y given X …
External link:
http://arxiv.org/abs/2010.15527
Author:
Ettich, Julia, Wittich, Christoph, Moll, Jens M., Behnke, Kristina, Floss, Doreen M., Reiners, Jens, Christmann, Andreas, Lang, Philipp A., Smits, Sander H.J., Kolmar, Harald, Scheller, Jürgen
Published in:
Journal of Biological Chemistry, November 2023, 299(11).
Regularized empirical risk minimization using kernels and their corresponding reproducing kernel Hilbert spaces (RKHSs) plays an important role in machine learning. However, the kernel actually used often depends on one or on a few hyperparameters, or …
External link:
http://arxiv.org/abs/1709.07625
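The hyperparameter dependence mentioned above is easy to see for the Gaussian kernel, whose bandwidth parameter interpolates between two degenerate extremes. A small illustrative sketch (the kernel choice and the toy points are assumptions, not from the paper):

```python
import numpy as np

def gauss_gram(X, gamma):
    """Gram matrix of the Gaussian kernel k_gamma(x, y) = exp(-gamma * |x - y|^2)."""
    return np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)

X = np.array([0.0, 0.5, 1.0, 2.0])

# Extreme bandwidths show why the hyperparameter matters:
# gamma -> 0   makes every point look alike (all-ones matrix, maximal smoothing),
# gamma -> inf makes every point look distinct (identity matrix, pure memorization).
K_wide = gauss_gram(X, 1e-6)
K_narrow = gauss_gram(X, 1e6)
```

Any estimator built from the Gram matrix therefore inherits this dependence, which motivates studying how the learning method behaves as the kernel hyperparameters vary.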
A short note on extension theorems and their connection to universal consistency in machine learning
Statistical machine learning plays an important role in modern statistics and computer science. One main goal of statistical machine learning is to provide universally consistent algorithms, i.e., the estimator converges in probability or in some stronger sense …
External link:
http://arxiv.org/abs/1604.04505
Author:
Christmann, Andreas, Zhou, Ding-Xuan
Regularized empirical risk minimization including support vector machines plays an important role in machine learning theory. In this paper, regularized pairwise learning (RPL) methods based on kernels are investigated. One example is regularized …
External link:
http://arxiv.org/abs/1510.03267