Showing 1 - 4 of 4
for search: '"Lu, Dakuan"'
Author:
Xu, Rui, Lu, Dakuan, Tan, Xiaoyu, Wang, Xintao, Yuan, Siyu, Chen, Jiangjie, Chu, Wei, Xu, Yinghui
Large language models (LLMs) have demonstrated impressive performance in various applications, among which role-playing language agents (RPLAs) have engaged a broad user base. Now, there is a growing demand for RPLAs that represent Key Opinion Leader…
External link:
http://arxiv.org/abs/2407.05305
Pre-trained language models (PLMs) have been prevailing in state-of-the-art methods for natural language processing, and knowledge-enhanced PLMs are further proposed to promote model performance in knowledge-intensive tasks. However, conceptual knowl…
External link:
http://arxiv.org/abs/2401.05669
Author:
Xu, Yipei, Lu, Dakuan, Liang, Jiaqing, Wang, Xintao, Geng, Yipeng, Xin, Yingsi, Wu, Hengkui, Chen, Ken, Zhang, Ruiji, Xiao, Yanghua
Pre-trained language models (PLMs) have established the new paradigm in the field of NLP. For more powerful PLMs, one of the most popular and successful ways is to continuously scale up the sizes of the models and the pre-training corpora. These large cor…
External link:
http://arxiv.org/abs/2311.09732
Author:
Lu, Dakuan, Wu, Hengkui, Liang, Jiaqing, Xu, Yipei, He, Qianyu, Geng, Yipeng, Han, Mengkun, Xin, Yingsi, Xiao, Yanghua
To advance Chinese financial natural language processing (NLP), we introduce BBT-FinT5, a new Chinese financial pre-training language model based on the T5 model. To support this effort, we have built BBT-FinCorpus, a large-scale financial corpus wit…
External link:
http://arxiv.org/abs/2302.09432