Showing 1 - 2 of 2 for search: '"Gui, Yuntao"'
Transformer-based large language models (e.g., BERT and GPT) achieve great success, and fine-tuning, which tunes a pre-trained model on a task-specific dataset, is the standard practice to utilize these models for downstream tasks. However, Transform…
External link:
http://arxiv.org/abs/2312.10365
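As an aside, the standard fine-tuning workflow that the abstract refers to can be sketched as follows. This is a minimal illustration, not code from the cited paper: the checkpoint (bert-base-uncased), the IMDB dataset, and all hyperparameters are assumptions chosen only to make the example runnable with the Hugging Face transformers and datasets libraries.

# Minimal sketch of fine-tuning a pre-trained Transformer on a task-specific
# dataset, as described in the abstract above. Model, dataset, and
# hyperparameters are illustrative assumptions, not taken from the paper.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

# Use a small slice of IMDB as a stand-in for the task-specific dataset.
dataset = load_dataset("imdb")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128,
                     padding="max_length")

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(output_dir="finetuned-bert",
                         num_train_epochs=1,
                         per_device_train_batch_size=16)

trainer = Trainer(model=model,
                  args=args,
                  train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=dataset["test"].select(range(500)))
trainer.train()

After training, trainer.evaluate() reports the loss on the held-out split; a compute_metrics function would be needed for task-specific metrics such as accuracy.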
Authors:
Liu, Zhongqi (liuzhongqi886@163.com), Gui, Jinxin, Yan, Yuntao, Zhang, Haiqing (hunanhongli@aliyun.com), He, Jiwai (hunanhongli@aliyun.com)
Published in:
International Journal of Molecular Sciences, Jul 2023, Vol. 24, Issue 14, p. 11527, 21 pp.