Showing 1 - 10 of 113 results for search: '"Li, Zhongnian"'
Weakly supervised learning has recently achieved considerable success in reducing annotation costs and label noise. Unfortunately, existing weakly supervised learning methods lack the ability to generate reliable labels via pre-trained vision-l
External link:
http://arxiv.org/abs/2405.15228
In multi-label classification, each training instance is associated with multiple class labels simultaneously. Unfortunately, collecting the fully precise class labels for each training instance is time- and labor-consuming for real-world application
External link:
http://arxiv.org/abs/2403.16482
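The multi-label setting described in the abstract above, where each instance carries several class labels at once, is commonly handled by treating each class as an independent one-vs-rest binary problem with a binary cross-entropy loss. The sketch below is a generic illustration of that baseline in NumPy, not the method of the linked paper:

```python
import numpy as np

def multilabel_bce(scores, targets):
    """Binary cross-entropy summed over labels: each class is an
    independent one-vs-rest binary problem with a multi-hot target."""
    probs = 1.0 / (1.0 + np.exp(-scores))          # sigmoid per class
    eps = 1e-12                                    # numerical guard
    losses = -(targets * np.log(probs + eps)
               + (1 - targets) * np.log(1 - probs + eps))
    return losses.sum(axis=1).mean()               # sum labels, mean batch

# One instance tagged with classes 0 and 2 out of 4 (multi-hot vector).
y = np.array([[1.0, 0.0, 1.0, 0.0]])
good = np.array([[5.0, -5.0, 5.0, -5.0]])   # scores agreeing with y
bad = np.array([[-5.0, 5.0, -5.0, 5.0]])    # scores disagreeing with y
```

Scores that agree with the multi-hot target yield a much smaller loss than scores that contradict it, which is what makes the loss usable for training.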
Long-tailed data is prevalent in real-world classification tasks and heavily relies on supervised information, which makes the annotation process exceptionally labor-intensive and time-consuming. Unfortunately, despite being a common approach to miti
External link:
http://arxiv.org/abs/2403.16469
Multi-Label Learning (MLL) often involves the assignment of multiple relevant labels to each instance, which can lead to the leakage of sensitive information (such as smoking, diseases, etc.) about the instances. However, existing MLL methods suffer from fail
External link:
http://arxiv.org/abs/2312.13312
Annotating multi-class instances is a crucial task in the field of machine learning. Unfortunately, identifying the correct class label from a long sequence of candidate labels is time-consuming and laborious. To alleviate this problem, we design a n
External link:
http://arxiv.org/abs/2302.00299
Published in:
Electronic Research Archive. 2024, Vol. 32 Issue 10, p1-15. 15p.
Complementary-Label Learning (CLL) arises in many real-world tasks, such as private question classification and online learning, and aims to alleviate the annotation cost compared with standard supervised learning. Unfortunately, most previous CLL
External link:
http://arxiv.org/abs/2211.10701
Published in:
Neural Networks Volume 166, September 2023, Pages 555-565
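In the complementary-label setting above, each training instance comes with a label it does *not* belong to. A simple baseline, assuming complementary labels are drawn uniformly as in the standard CLL setup, converts each complementary label into a soft target spread evenly over the remaining K-1 classes; this is a generic sketch, not the linked paper's algorithm:

```python
import numpy as np

def complementary_to_soft_target(comp_label, num_classes):
    """A complementary label says 'this instance is NOT class comp_label'.
    Convert it to a soft target uniform over the other K-1 classes,
    which can then be used with an ordinary cross-entropy loss."""
    target = np.full(num_classes, 1.0 / (num_classes - 1))
    target[comp_label] = 0.0            # zero mass on the forbidden class
    return target

# Complementary label 2 out of 4 classes: mass spreads over {0, 1, 3}.
t = complementary_to_soft_target(2, num_classes=4)
```

The resulting target is a valid probability vector, so the usual softmax cross-entropy machinery applies unchanged.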
Complementary-label learning (CLL) is widely used in weakly supervised classification, but it faces a significant challenge in real-world datasets when confronted with class-imbalanced training samples. In such scenarios, the number of samples in one
External link:
http://arxiv.org/abs/2209.14189
Positive Unlabeled (PU) learning aims to learn a binary classifier from only positive and unlabeled data, which is utilized in many real-world scenarios. However, existing PU learning algorithms cannot deal with the real-world challenge in an open an
External link:
http://arxiv.org/abs/2207.13274
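The PU setting above trains a binary classifier from positive and unlabeled data only. A standard approach (in the spirit of the non-negative PU risk estimator of Kiryo et al., shown here as a hedged illustration rather than the linked paper's method) estimates the negative-class risk from the unlabeled data and clips it at zero:

```python
import numpy as np

def nn_pu_risk(scores_p, scores_u, pi_p, loss):
    """Non-negative PU risk estimate. The negative risk is estimated as
    (unlabeled treated as negative) minus (positive contribution),
    clipped at zero to curb overfitting. pi_p is the assumed-known
    class prior P(y = +1)."""
    risk_p_pos = loss(scores_p, +1).mean()   # positives labeled positive
    risk_p_neg = loss(scores_p, -1).mean()   # positives labeled negative
    risk_u_neg = loss(scores_u, -1).mean()   # unlabeled labeled negative
    neg_risk = risk_u_neg - pi_p * risk_p_neg
    return pi_p * risk_p_pos + max(0.0, neg_risk)

# Sigmoid loss l(z, y) = 1 / (1 + exp(y * z)), a common choice in PU work.
sigmoid_loss = lambda z, y: 1.0 / (1.0 + np.exp(y * z))

scores_p = np.array([2.0, 3.0])     # classifier outputs on positives
scores_u = np.array([-1.0, 1.0])    # classifier outputs on unlabeled data
r = nn_pu_risk(scores_p, scores_u, pi_p=0.5, loss=sigmoid_loss)
```

The clipping `max(0.0, neg_risk)` is the "non-negative" correction: without it, the estimated negative risk can go below zero and the model overfits the unlabeled set.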
Adversarial training (AT) has shown excellent performance in defending against adversarial examples. Recent studies demonstrate that examples are not equally important to the final robustness of models during AT; that is, the so-called hard exam
External link:
http://arxiv.org/abs/2206.12292
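The adversarial examples that AT defends against are typically crafted with gradient-based attacks such as FGSM, which perturbs each input coordinate by a small step in the sign of the loss gradient. The sketch below applies FGSM to a hand-written logistic-regression model as a generic illustration (not the linked paper's setup):

```python
import numpy as np

def fgsm_perturb(x, w, b, y, eps):
    """One FGSM step: move each input coordinate by eps in the sign of
    the input gradient of the logistic loss, which to first order
    maximally increases the loss within an L-inf ball of radius eps."""
    z = x @ w + b
    p = 1.0 / (1.0 + np.exp(-z))        # P(y = 1 | x)
    grad_x = (p - y) * w                # d(logistic loss) / dx
    return x + eps * np.sign(grad_x)

def logistic_loss(x, w, b, y):
    z = x @ w + b
    return np.log1p(np.exp(-z)) if y == 1 else np.log1p(np.exp(z))

w, b = np.array([1.0, -2.0]), 0.0
x, y = np.array([0.5, -0.5]), 1         # correctly classified point
x_adv = fgsm_perturb(x, w, b, y, eps=0.3)
```

Adversarial training then minimizes the loss on such perturbed inputs instead of (or alongside) the clean ones.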