Showing 1 - 10 of 12 for the search: '"Zewen Chi"'
Published in:
IEEE/ACM Transactions on Audio, Speech, and Language Processing, pp. 1-14
Published in:
Information Sciences, 584:170-183
Food recommendation has attracted increasing attention in various food-related applications and services. Food recommender models aim to match users’ preferences with recipes, where the key lies in the representation learning of users and recipes…
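Representation-based matching of this kind is commonly implemented as a two-tower model that scores a user-recipe pair by the dot product of their learned embeddings. The sketch below illustrates only that general pattern; the class name, sizes, and architecture are hypothetical, not the paper's actual model.

```python
# Minimal two-tower sketch: learn user and recipe representations and
# score a (user, recipe) pair by their dot product. Names and sizes
# are illustrative.
import torch
import torch.nn as nn

class TwoTowerRecommender(nn.Module):
    def __init__(self, n_users: int, n_recipes: int, dim: int = 64):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)      # user representations
        self.recipe_emb = nn.Embedding(n_recipes, dim)  # recipe representations

    def forward(self, user_ids, recipe_ids):
        u = self.user_emb(user_ids)
        r = self.recipe_emb(recipe_ids)
        return (u * r).sum(dim=-1)  # higher score = better preference match

model = TwoTowerRecommender(n_users=1000, n_recipes=5000)
scores = model(torch.tensor([0, 1]), torch.tensor([42, 7]))  # two pairs
```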
Published in:
2021 IEEE International Conference on Big Knowledge (ICBK).
Published in:
ACL/IJCNLP (1)
Cross-lingual language models are typically pretrained with masked language modeling on multilingual text or parallel sentences. In this paper, we introduce denoising word alignment as a new cross-lingual pre-training task. Specifically, the model…
(A minimal sketch of the masked-language-modeling step appears after the links below.)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2ebe4268dc7b85b80a39acadf5065630
http://arxiv.org/abs/2106.06381
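For context, masked language modeling, the pretraining task the abstract contrasts with, hides a fraction of tokens and trains the model to recover them. The helper below is a hedged sketch of that masking step, assuming the usual 15% mask rate; token IDs are fake and the function name is hypothetical.

```python
# Sketch of the MLM masking step: hide ~15% of tokens and train the model
# to predict the originals; unmasked positions get label -100, which
# PyTorch's cross-entropy ignores.
import torch

def mask_tokens(input_ids: torch.Tensor, mask_id: int, p: float = 0.15):
    labels = input_ids.clone()
    mask = torch.rand(input_ids.shape) < p  # positions to mask
    labels[~mask] = -100                    # only masked positions are scored
    masked = input_ids.clone()
    masked[mask] = mask_id                  # replace with the [MASK] id
    return masked, labels

ids = torch.randint(5, 1000, (2, 16))       # a fake multilingual batch
masked, labels = mask_tokens(ids, mask_id=4)
# loss = F.cross_entropy(logits.view(-1, vocab_size), labels.view(-1))
```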
Published in:
Natural Language Processing and Chinese Computing, ISBN: 9783030884826
NLPCC (2)
Recently, open-domain dialogue systems have attracted growing attention. Most of them use the sequence-to-sequence (Seq2Seq) architecture to generate dialogue responses. However, traditional Seq2Seq-based open-domain dialogue models tend to generate…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_________::a50b06e78eeadd703e93c91146df49a0
https://doi.org/10.1007/978-3-030-88483-3_14
Author:
Zewen Chi, Li Dong, Shuming Ma, Shaohan Huang, Saksham Singhal, Xian-Ling Mao, Heyan Huang, Xia Song, Furu Wei
Published in:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing.
Author:
Bo Zheng, Furu Wei, Zewen Chi, Ting Liu, Shaohan Huang, Saksham Singhal, Wanxiang Che, Xia Song, Li Dong, Wenhui Wang
Published in:
ACL/IJCNLP (1)
Fine-tuning pre-trained cross-lingual language models can transfer task-specific supervision from one language to the others. In this work, we propose to improve cross-lingual fine-tuning with consistency regularization. Specifically, we use example…
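A common way to implement example consistency regularization is to penalize divergence between the model's predictions on an input and on an augmented version of it (e.g., its translation). The symmetric-KL loss below is one standard formulation, offered as a sketch rather than the paper's exact objective.

```python
# Consistency term: keep predictions on an example and its augmentation
# (e.g., a translation) close, here via symmetric KL divergence.
import torch
import torch.nn.functional as F

def consistency_loss(logits_orig, logits_aug):
    p = F.log_softmax(logits_orig, dim=-1)
    q = F.log_softmax(logits_aug, dim=-1)
    kl_pq = F.kl_div(q, p, log_target=True, reduction="batchmean")
    kl_qp = F.kl_div(p, q, log_target=True, reduction="batchmean")
    return 0.5 * (kl_pq + kl_qp)

logits_a = torch.randn(8, 3)  # predictions on original examples
logits_b = torch.randn(8, 3)  # predictions on translated examples
labels = torch.randint(0, 3, (8,))
total = F.cross_entropy(logits_a, labels) + consistency_loss(logits_a, logits_b)
```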
Published in:
2020 IEEE International Conference on Knowledge Graph (ICKG).
Building reliable named entity recognition (NER) systems with limited annotated data has recently attracted much attention. Nearly all existing works rely heavily on domain-specific resources, such as external lexicons and knowledge bases. However…
Author:
Li Dong, Furu Wei, Saksham Singhal, Ming Zhou, Xia Song, Xian-Ling Mao, Nan Yang, Zewen Chi, Wenhui Wang, Heyan Huang
Published in:
NAACL-HLT
In this work, we present an information-theoretic framework that formulates cross-lingual language model pre-training as maximizing mutual information between multilingual-multi-granularity texts. The unified view helps us to better understand the existing…
(An InfoNCE-style sketch of the mutual-information objective appears after the link below.)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::47a85cb4b4566dd3fe2362a22af50d47
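Maximizing mutual information between paired texts is commonly realized as a contrastive InfoNCE lower bound: a sentence and its translation form a positive pair, and the other sentences in the batch act as negatives. The sketch below shows that generic objective, not necessarily the paper's actual training code.

```python
# InfoNCE sketch: row i of `anchors` pairs with row i of `positives`
# (e.g., a sentence and its translation); all other rows are negatives.
import torch
import torch.nn.functional as F

def info_nce(anchors, positives, temperature: float = 0.1):
    a = F.normalize(anchors, dim=-1)
    b = F.normalize(positives, dim=-1)
    logits = a @ b.t() / temperature    # pairwise similarities
    targets = torch.arange(a.size(0))   # the diagonal entries are positives
    return F.cross_entropy(logits, targets)

loss = info_nce(torch.randn(16, 128), torch.randn(16, 128))
```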
Published in:
AAAI
In this work, we focus on transferring supervision signals of natural language generation (NLG) tasks between multiple languages. We propose to pretrain the encoder and the decoder of a sequence-to-sequence model under both monolingual and cross-lingual…
(A sketch of mixing monolingual and cross-lingual seq2seq objectives appears after the link below.)
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::fe8aa83b0bad264114454cde11583589
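Pretraining a sequence-to-sequence model under both settings typically mixes a monolingual objective (denoising autoencoding: reconstruct text from a corrupted version of itself) with a cross-lingual one (generate a translation). The data-side sketch below illustrates that mix under those assumptions; the field names and noising scheme are hypothetical.

```python
# Build (source, target) pretraining pairs: monolingual denoising or
# cross-lingual translation, chosen at random per example.
import random

def make_pretraining_pair(example: dict):
    if example.get("translation") and random.random() < 0.5:
        # Cross-lingual: text in one language in, its translation out.
        return example["text"], example["translation"]
    # Monolingual denoising: drop ~15% of tokens, reconstruct the original.
    tokens = example["text"].split()
    noised = [t for t in tokens if random.random() > 0.15]
    return " ".join(noised), example["text"]

src, tgt = make_pretraining_pair({"text": "hello world example",
                                  "translation": "bonjour le monde"})
```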