Showing 1 - 9 of 9 for the search: '"Høgsgaard, Mikael Møller"'
Multiclass learnability is known to exhibit a properness barrier: there are learnable classes which cannot be learned by any proper learner. Binary classification faces no such barrier for learnability, but a similar one for optimal learning …
External link: http://arxiv.org/abs/2410.22749
Boosting is an extremely successful idea, allowing one to combine multiple low-accuracy classifiers into a much more accurate voting classifier. In this work, we present a new and surprisingly simple Boosting algorithm that obtains a provably optimal …
External link: http://arxiv.org/abs/2408.17148
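For intuition only, a minimal NumPy sketch of the generic weak-to-strong idea the abstract describes: train weak classifiers on resampled data and combine them by majority vote. This is an illustrative combiner, not the paper's algorithm; `weak_learner` is an assumed callable that returns a {-1, +1}-valued classifier.

```python
import numpy as np

def majority_vote_boost(weak_learner, X, y, rounds=25, rng=None):
    """Illustrative weak-to-strong combiner (not the paper's algorithm):
    train `weak_learner` on bootstrap resamples and majority-vote.
    Labels are assumed to be in {-1, +1}; `rounds` is odd to avoid ties."""
    rng = np.random.default_rng(rng)
    n = len(X)
    hypotheses = []
    for _ in range(rounds):
        idx = rng.integers(0, n, size=n)        # bootstrap resample
        hypotheses.append(weak_learner(X[idx], y[idx]))
    def strong(X_new):
        votes = np.sum([h(X_new) for h in hypotheses], axis=0)
        return np.sign(votes)                   # majority vote
    return strong
```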
Recent works on the parallel complexity of Boosting have established strong lower bounds on the tradeoff between the number of training rounds $p$ and the total parallel work per round $t$. These works have also presented highly non-trivial parallel …
External link: http://arxiv.org/abs/2408.16653
Developing an optimal PAC learning algorithm in the realizable setting, where empirical risk minimization (ERM) is suboptimal, was a major open problem in learning theory for decades. The problem was finally resolved by Hanneke a few years ago. Unfortunately, …
External link: http://arxiv.org/abs/2403.08831
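As a rough illustration of the subsample-and-vote shape that known optimal PAC learners take (a hedged sketch only, not Hanneke's construction and not this paper's algorithm), assuming an `erm` callable that returns a {-1, +1}-valued hypothesis:

```python
import numpy as np

def subsample_majority_erm(erm, X, y, voters=11, rng=None):
    """Sketch of a subsample-and-vote learner: run ERM on random
    subsamples and take a majority vote. Shape of the construction
    only; `voters` is odd so the vote never ties."""
    rng = np.random.default_rng(rng)
    n = len(X)
    half = n // 2
    hyps = []
    for _ in range(voters):
        idx = rng.choice(n, size=half, replace=False)  # random subsample
        hyps.append(erm(X[idx], y[idx]))               # ERM on subsample
    return lambda Xq: np.sign(np.sum([h(Xq) for h in hyps], axis=0))
```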
Author: Høgsgaard, Mikael Møller, Kamma, Lior, Larsen, Kasper Green, Nelson, Jelani, Schwiegelshohn, Chris
The sparse Johnson-Lindenstrauss transform is one of the central techniques in dimensionality reduction. It supports embedding a set of $n$ points in $\mathbb{R}^d$ into $m=O(\varepsilon^{-2} \lg n)$ dimensions while preserving all pairwise distances …
External link: http://arxiv.org/abs/2302.06165
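A minimal sketch of the textbook sparse JL construction, where each column of the embedding matrix has exactly $s$ nonzero entries of value $\pm 1/\sqrt{s}$. This illustrates the general technique, not the specific construction or analysis in the paper:

```python
import numpy as np

def sparse_jl(d, m, s, rng=None):
    """Sample a sparse JL matrix: each column has exactly s nonzero
    entries, each +-1/sqrt(s), in uniformly random rows (requires
    s <= m). Textbook construction, for illustration only."""
    rng = np.random.default_rng(rng)
    A = np.zeros((m, d))
    for j in range(d):
        rows = rng.choice(m, size=s, replace=False)
        A[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)
    return A

# Embed n points stored as rows of X (n x d): Y = X @ A.T maps R^d -> R^m.
```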
Author: Høgsgaard, Mikael Møller, Karras, Panagiotis, Ma, Wenyue, Rathi, Nidhi, Schwiegelshohn, Chris
For the fundamental problem of allocating a set of resources among individuals with varied preferences, the quality of an allocation relates to the degree of fairness and the collective welfare achieved. Unfortunately, in many resource-allocation settings …
External link: http://arxiv.org/abs/2302.03071
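To make the fairness notion concrete, a sketch of the classical round-robin mechanism, which guarantees envy-freeness up to one good (EF1) under additive valuations. This is a standard illustration, not the paper's mechanism:

```python
import numpy as np

def round_robin(valuations):
    """Round-robin allocation of indivisible goods: agents pick their
    favourite remaining item in turn. EF1 for additive valuations.
    `valuations[i][g]` is agent i's value for good g."""
    n_agents, n_goods = valuations.shape
    remaining = set(range(n_goods))
    bundles = [[] for _ in range(n_agents)]
    for t in range(n_goods):
        i = t % n_agents                                  # whose turn
        g = max(remaining, key=lambda g: valuations[i][g])
        bundles[i].append(g)
        remaining.remove(g)
    return bundles
```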
AdaBoost is a classic boosting algorithm that combines multiple inaccurate classifiers produced by a weak learner into a strong learner with arbitrarily high accuracy when given enough training data. Determining the optimal number of samples …
External link: http://arxiv.org/abs/2301.11571
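For reference, textbook AdaBoost as it is usually stated, with the multiplicative weight update on examples; the `weak_learner(X, y, D)` interface (a weak learner accepting a distribution over examples) is an assumption of the sketch:

```python
import numpy as np

def adaboost(weak_learner, X, y, rounds=50):
    """Textbook AdaBoost, labels in {-1, +1}: reweight examples so the
    weak learner focuses on past mistakes, then weight each hypothesis
    by its accuracy."""
    n = len(X)
    D = np.full(n, 1.0 / n)                     # distribution over examples
    hyps, alphas = [], []
    for _ in range(rounds):
        h = weak_learner(X, y, D)               # weak learning under D
        pred = h(X)
        eps = np.sum(D[pred != y])              # weighted error
        eps = min(max(eps, 1e-12), 1 - 1e-12)   # numerical guard
        alpha = 0.5 * np.log((1 - eps) / eps)
        D *= np.exp(-alpha * y * pred)          # up-weight mistakes
        D /= D.sum()
        hyps.append(h)
        alphas.append(alpha)
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, hyps)))
```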
The Johnson-Lindenstrauss transform allows one to embed a dataset of $n$ points in $\mathbb{R}^d$ into $\mathbb{R}^m$, while preserving the pairwise distance between any pair of points up to a factor $(1 \pm \varepsilon)$, provided that $m = \Omega(\varepsilon^{-2} \lg n)$ …
External link: http://arxiv.org/abs/2207.03304
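A minimal dense Gaussian JL embedding under the stated bound, using i.i.d. $N(0, 1/m)$ entries; the constant in the choice of $m$ is illustrative, not a tight value:

```python
import numpy as np

def gaussian_jl(X, eps, rng=None):
    """Dense Gaussian JL embedding: project the n points (rows of X,
    shape n x d) down to m = O(eps^-2 log n) dimensions with i.i.d.
    N(0, 1/m) entries; pairwise distances are preserved up to
    (1 +- eps) with high probability. Constant 8 is illustrative."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    m = int(np.ceil(8 * np.log(n) / eps**2))
    A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, d))
    return X @ A.T
```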
The seminal Fast Johnson-Lindenstrauss (Fast JL) transform by Ailon and Chazelle (SICOMP'09) embeds a set of $n$ points in $d$-dimensional Euclidean space into optimal $k=O(\varepsilon^{-2} \ln n)$ dimensions, while preserving all pairwise distances …
External link: http://arxiv.org/abs/2204.01800
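A sketch of the closely related subsampled randomized Hadamard transform (SRHT); the original Fast JL transform of Ailon and Chazelle applies a sparse projection after the randomized Hadamard step rather than plain row sampling, and $d$ is assumed to be a power of two here:

```python
import numpy as np
from scipy.linalg import hadamard

def srht(X, k, rng=None):
    """Subsampled randomized Hadamard transform: randomly flip signs,
    apply the orthonormal Hadamard transform, then sample and rescale
    k coordinates. Requires the dimension d (columns of X) to be a
    power of two; a relative of Fast JL, not its exact construction."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = hadamard(d) / np.sqrt(d)                # orthonormal Hadamard matrix
    D = rng.choice([-1.0, 1.0], size=d)         # random sign flips
    rows = rng.choice(d, size=k, replace=False) # sample k coordinates
    return np.sqrt(d / k) * (X * D) @ H[rows].T
```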