Showing 1 - 10 of 106 results for search: '"Yang, Zhibang"'
Parameter-efficient fine-tuning methods, represented by LoRA, play an essential role in adapting large-scale pre-trained models to downstream tasks. However, fine-tuning LoRA-series models also faces the risk of overfitting on the training dataset, a …
External link:
http://arxiv.org/abs/2404.09610
With the increasingly powerful performances and enormous scales of pretrained models, promoting parameter efficiency in fine-tuning has become a crucial need for effective and efficient adaptation to various downstream tasks. One representative line …
External link:
http://arxiv.org/abs/2404.04316
Semi-supervised learning (SSL) has achieved great success in leveraging a large amount of unlabeled data to learn a promising classifier. A popular approach is pseudo-labeling, which generates pseudo labels only for those unlabeled data with high-confidence …
External link:
http://arxiv.org/abs/2212.06643
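The abstract above describes confidence-based pseudo-labeling. A minimal sketch of that selection step, assuming the model outputs per-class probabilities; the threshold value is an illustrative assumption, not taken from the paper.

```python
import numpy as np

def pseudo_label(probs, threshold=0.85):
    """Assign pseudo labels only to examples whose top predicted
    probability meets the confidence threshold.

    probs: (n_samples, n_classes) array of predicted class probabilities.
    Returns (labels_for_kept_examples, boolean_keep_mask).
    """
    confidence = probs.max(axis=1)           # top-class probability per example
    keep = confidence >= threshold           # only confident predictions survive
    labels = probs.argmax(axis=1)            # predicted class per example
    return labels[keep], keep

probs = np.array([[0.98, 0.02],
                  [0.60, 0.40],
                  [0.10, 0.90]])
labels, keep = pseudo_label(probs)
# Rows 0 and 2 clear the 0.85 threshold; row 1 stays unlabeled
```

The selected (example, pseudo-label) pairs would then be added to the supervised training set, while low-confidence examples are held back to limit label noise.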
Published in:
In Journal of Systems Architecture, August 2024, Vol. 153
Published in:
In Neural Networks, February 2024, 170:417-426
Published in:
In Future Generation Computer Systems, April 2023, 141:768-776
Published in:
In Future Generation Computer Systems, April 2023, 141:106-115
Published in:
In Information Sciences, April 2022, 589:376-394
Academic article
Published in:
In Neurocomputing, 7 June 2021, 439:96-105