Showing 1 - 4 of 4 results for search: "Huang, Xinjing"
Authors:
Ren, Xiaozhe, Zhou, Pingyi, Meng, Xinfan, Huang, Xinjing, Wang, Yadao, Wang, Weichao, Li, Pengfei, Zhang, Xiaoda, Podolskiy, Alexander, Arshinov, Grigory, Bout, Andrey, Piontkovskaya, Irina, Wei, Jiansheng, Jiang, Xin, Su, Teng, Liu, Qun, Yao, Jun
The scaling of large language models has greatly improved natural language understanding, generation, and reasoning. In this work, we develop a system that trained a trillion-parameter language model on a cluster of Ascend 910 AI processors and MindS…
External link:
http://arxiv.org/abs/2303.10845
Authors:
Zeng, Wei, Ren, Xiaozhe, Su, Teng, Wang, Hui, Liao, Yi, Wang, Zhiwei, Jiang, Xin, Yang, ZhenZhang, Wang, Kaisheng, Zhang, Xiaoda, Li, Chen, Gong, Ziyan, Yao, Yifan, Huang, Xinjing, Wang, Jun, Yu, Jianfeng, Guo, Qi, Yu, Yue, Zhang, Yan, Wang, Jin, Tao, Hengtao, Yan, Dasen, Yi, Zexuan, Peng, Fang, Jiang, Fangqing, Zhang, Han, Deng, Lingfeng, Zhang, Yehong, Lin, Zhe, Zhang, Chao, Zhang, Shaojie, Guo, Mingyue, Gu, Shanzhi, Fan, Gaojun, Wang, Yaowei, Jin, Xuefeng, Liu, Qun, Tian, Yonghong
Large-scale Pretrained Language Models (PLMs) have become the new paradigm for Natural Language Processing (NLP). PLMs with hundreds of billions of parameters, such as GPT-3, have demonstrated strong performance on natural language understanding and gene…
External link:
http://arxiv.org/abs/2104.12369
Incorporating external knowledge into a neural dialogue model is critically important for dialogue systems to behave like real humans. For this problem, memory networks are a common and promising choice. However, existing memo…
External link:
http://arxiv.org/abs/1909.11287
Dialogue systems that handle multi-domain tasks are in high demand. How to record the state remains a key problem in a task-oriented dialogue system. Normally, we use human-defined features as dialogue states and apply a state tracker to extract the…
External link:
http://arxiv.org/abs/1908.07137