Showing 1 - 10 of 65 results for the search: '"Zhong, YuTao"'
We present a detailed study of surrogate losses and algorithms for multi-label learning, supported by $H$-consistency bounds. We first show that, for the simplest form of multi-label loss (the popular Hamming loss), the well-known consistent binary relevance…
External link:
http://arxiv.org/abs/2407.13746
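The Hamming loss named in the abstract above is the standard per-label disagreement rate; a minimal NumPy sketch of the target loss itself (illustrative only, not the paper's surrogate construction):

```python
import numpy as np

def hamming_loss(y_true, y_pred):
    """Fraction of label positions where prediction and ground truth disagree."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    return float(np.mean(y_true != y_pred))

# Two examples, three binary labels each: 2 mismatches out of 6 positions.
y_true = np.array([[1, 0, 1], [0, 1, 0]])
y_pred = np.array([[1, 1, 1], [0, 1, 1]])
print(hamming_loss(y_true, y_pred))  # 0.333...
```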
We present a comprehensive study of surrogate loss functions for learning to defer. We introduce a broad family of surrogate losses, parameterized by a non-increasing function $\Psi$, and establish their realizable $H$-consistency under mild conditions…
External link:
http://arxiv.org/abs/2407.13732
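At inference time, learning to defer reduces to a choice between the model's prediction and a deferral action. A hedged sketch of that decision rule, using an assumed constant expert-accuracy estimate and deferral cost (not the paper's learned deferral function):

```python
import numpy as np

def defer_decision(model_probs, expert_accuracy, defer_cost):
    """Return the model's predicted class, or "defer" when the expert's
    expected accuracy (discounted by a deferral cost) beats model confidence."""
    model_conf = model_probs.max()
    if expert_accuracy - defer_cost > model_conf:
        return "defer"
    return int(model_probs.argmax())

# Confident model keeps the prediction; uncertain model defers.
print(defer_decision(np.array([0.90, 0.10]), expert_accuracy=0.95, defer_cost=0.1))  # 0
print(defer_decision(np.array([0.55, 0.45]), expert_accuracy=0.95, defer_cost=0.1))  # defer
```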
Recent research has introduced a key notion of $H$-consistency bounds for surrogate losses. These bounds offer finite-sample guarantees, quantifying the relationship between the zero-one estimation error (or other target loss) and the surrogate loss…
External link:
http://arxiv.org/abs/2407.13722
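Schematically, an $H$-consistency bound controls the excess target error by the excess surrogate error over the hypothesis set $H$; a hedged sketch of the general form such bounds take (notation assumed, minimizability-gap terms omitted):

```latex
% Schematic H-consistency bound: excess target error of any h in H
% is bounded by a non-decreasing function Gamma of its excess surrogate error.
\[
  \mathcal{E}_{\ell}(h) - \mathcal{E}_{\ell}^{*}(H)
  \;\le\;
  \Gamma\!\bigl(\mathcal{E}_{L}(h) - \mathcal{E}_{L}^{*}(H)\bigr)
  \qquad \text{for all } h \in H,
\]
where $\ell$ is the target loss, $L$ the surrogate, and $\mathcal{E}^{*}(H)$ the best-in-class error.
```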
We present a detailed study of cardinality-aware top-$k$ classification, a novel approach that aims to learn an accurate top-$k$ set predictor while maintaining a low cardinality. We introduce a new target loss function tailored to this setting that…
External link:
http://arxiv.org/abs/2407.07140
This paper presents a comprehensive analysis of the growth rate of $H$-consistency bounds (and excess error bounds) for various surrogate losses used in classification. We prove a square-root growth rate near zero for smooth margin-based surrogate losses…
External link:
http://arxiv.org/abs/2405.05968
We present a detailed study of top-$k$ classification, the task of predicting the $k$ most probable classes for an input, extending beyond single-class prediction. We demonstrate that several prevalent surrogate loss functions in multi-class classification…
External link:
http://arxiv.org/abs/2403.19625
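The top-$k$ target loss described in the abstract above is simply whether the true class lands among the $k$ highest-scoring classes; a minimal NumPy sketch of the corresponding accuracy (illustrative, not the paper's surrogate losses):

```python
import numpy as np

def top_k_accuracy(scores, labels, k):
    """Fraction of examples whose true label is among the k highest-scoring classes."""
    topk = np.argsort(scores, axis=1)[:, -k:]  # indices of the k largest scores per row
    return float(np.mean([labels[i] in topk[i] for i in range(len(labels))]))

# Row 0: label 2 is in the top-2 {1, 2}; row 1: label 2 is not in the top-2 {0, 1}.
scores = np.array([[0.1, 0.5, 0.4],
                   [0.7, 0.2, 0.1]])
labels = np.array([2, 2])
print(top_k_accuracy(scores, labels, k=2))  # 0.5
```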
Learning to defer with multiple experts is a framework where the learner can choose to defer the prediction to several experts. While this problem has received significant attention in classification contexts, it presents unique challenges in regression…
External link:
http://arxiv.org/abs/2403.19494
We present a detailed study of $H$-consistency bounds for regression. We first present new theorems that generalize the tools previously given to establish $H$-consistency bounds. This generalization proves essential for analyzing $H$-consistency bounds…
External link:
http://arxiv.org/abs/2403.19480
We present a study of surrogate losses and algorithms for the general problem of learning to defer with multiple experts. We first introduce a new family of surrogate losses specifically tailored for the multiple-expert setting, where the prediction…
External link:
http://arxiv.org/abs/2310.14774
We study the key framework of learning with abstention in the multi-class classification setting. In this setting, the learner can choose to abstain from making a prediction with some pre-defined cost. We present a series of new theoretical and algorithmic…
External link:
http://arxiv.org/abs/2310.14772
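The pre-defined abstention cost in the abstract above induces a simple confidence-threshold decision at prediction time; a hedged sketch in the spirit of Chow's classical rule (illustrative only, not the paper's surrogate-loss algorithms):

```python
import numpy as np

def predict_with_abstention(probs, cost):
    """Predict the argmax class, or abstain (return -1) when the expected
    misclassification risk 1 - max(probs) exceeds the abstention cost."""
    if 1.0 - probs.max() > cost:
        return -1  # abstaining is cheaper than the expected error
    return int(probs.argmax())

# cost = 0.3: abstain only when the classifier is less than 70% confident.
print(predict_with_abstention(np.array([0.90, 0.05, 0.05]), cost=0.3))  # 0
print(predict_with_abstention(np.array([0.40, 0.35, 0.25]), cost=0.3))  # -1 (abstain)
```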