Showing 1 - 10 of 194 for search: '"Li Rumei"'
As large language models (LLMs) continue to advance in capability and influence, ensuring their security and preventing harmful outputs has become crucial. A promising approach to address these concerns involves training models to automatically gener…
External link:
http://arxiv.org/abs/2408.02632
Knowledge Base Question Answering (KBQA) aims to answer natural language questions with factual information such as entities and relations in KBs. However, traditional Pre-trained Language Models (PLMs) are directly pre-trained on large-scale natural…
External link:
http://arxiv.org/abs/2308.14436
Author:
Li, Rumei1 (AUTHOR) 2220902206@cnu.edu.cn, Zhang, Liyan1,2 (AUTHOR) 2230902185@cnu.edu.cn, Wang, Zun1 (AUTHOR) lixiaojuan@cnu.edu.cn, Li, Xiaojuan1,2 (AUTHOR)
Published in:
Sensors (1424-8220), Nov 2024, Vol. 24, Issue 21, p. 7023, 20 pp.
Author:
Song, Jian, Liang, Di, Li, Rumei, Li, Yuntao, Wang, Sirui, Peng, Minlong, Wu, Wei, Yu, Yongxin
Transformer-based pre-trained models like BERT have achieved great progress on Semantic Sentence Matching. Meanwhile, dependency prior knowledge has also shown general benefits in multiple NLP tasks. However, how to efficiently integrate dependency p…
External link:
http://arxiv.org/abs/2210.08471
Published in:
In Computational Biology and Chemistry, October 2024, Vol. 112
Recently, prompt-based methods have achieved significant performance in few-shot learning scenarios by bridging the gap between language model pre-training and fine-tuning for downstream tasks. However, existing prompt templates are mostly designed f…
External link:
http://arxiv.org/abs/2203.03903
Published in:
In Science of the Total Environment, 10 April 2024, Vol. 920
Learning high-quality sentence representations benefits a wide range of natural language processing tasks. Though BERT-based pre-trained language models achieve high performance on many downstream tasks, the native derived sentence representations ar…
External link:
http://arxiv.org/abs/2105.11741
Author:
Li, Rumei, Lu, Bin, Li, Qiang, Hu, Ji, Huang, Yun, Wang, Yangang, Qin, Guijun, Zhang, Weiwei, Su, Qing, Zhu, Jun, Xu, Yancheng, Jiang, Hongwei, Wang, Xinjun, Zhang, Keqing, Yang, Yuzhi, Hu, Renming
Published in:
In Primary Care Diabetes, February 2024, 18(1):97-103
Author:
Jia, Xiaoyan, Li, Rumei, Zhu, Shuping, Bao, Aijuan, Liu, Xiaoxiao, Kong, Boyang, Hu, Jiahuan, Jin, Xiaojie, Kong, Weibao, Zhang, Ji, Wang, Junlong
Published in:
In Carbohydrate Polymers, 1 January 2024, Vol. 323