Showing 1 - 10 of 70 results for search: '"Chen, Xinchi"'
Author:
Chen, Jifan, Zhang, Yuhao, Liu, Lan, Dong, Rui, Chen, Xinchi, Ng, Patrick, Wang, William Yang, Huang, Zhiheng
There has been great progress in unifying various table-to-text tasks using a single encoder-decoder model trained via multi-task learning (Xie et al., 2022). However, existing methods typically encode task information with a simple dataset name as a…
External link:
http://arxiv.org/abs/2212.08780
Author:
Hu, Xiyang, Chen, Xinchi, Qi, Peng, Kong, Deguang, Liu, Kunlun, Wang, William Yang, Huang, Zhiheng
Multilingual information retrieval (IR) is challenging since annotated training data is costly to obtain in many languages. We present an effective method to train multilingual IR systems when only English IR training data and some parallel corpora…
External link:
http://arxiv.org/abs/2210.06633
Author:
Ribeiro, Danilo, Wang, Shen, Ma, Xiaofei, Dong, Rui, Wei, Xiaokai, Zhu, Henry, Chen, Xinchi, Huang, Zhiheng, Xu, Peng, Arnold, Andrew, Roth, Dan
Large language models have achieved high performance on various question answering (QA) benchmarks, but the explainability of their output remains elusive. Structured explanations, called entailment trees, were recently suggested as a way to explain…
External link:
http://arxiv.org/abs/2205.09224
Author:
Yu, Peng-Cheng, Hou, Dan, Chang, Binhe, Liu, Na, Xu, Chun-Hui, Chen, Xinchi, Hu, Cheng-Long, Liu, Ting, Wang, Xiaoning, Zhang, Qunling, Liu, Ping, Jiang, Yilun, Fei, Ming-Yue, Zong, Li-Juan, Zhang, Jia-Ying, Liu, Hui, Chen, Bing-Yi, Chen, Shu-Bei, Wang, Yong, Li, Zi-Juan, Li, Xiya, Deng, Chu-Han, Ren, Yi-Yi, Zhao, Muying, Jiang, Shiyu, Wang, Roujia, Jin, Jiacheng, Yang, Shaoxin, Xue, Kai, Shi, Jun, Chang, Chun-Kang, Shen, Shuhong, Wang, Zhikai, He, Peng-Cheng, Chen, Zhu, Chen, Sai-Juan, Sun, Xiao-Jian, Wang, Lan
Published in:
In Developmental Cell 5 August 2024 59(15):1954-1971
Recent progress in pretrained Transformer-based language models has shown great success in learning contextual representations of text. However, due to the quadratic self-attention complexity, most pretrained Transformer models can only handle…
External link:
http://arxiv.org/abs/2110.10778
Published in:
In Journal of Hydrology May 2024 635
Semantic role labeling (SRL) involves extracting propositions (i.e., predicates and their typed arguments) from natural language sentences. State-of-the-art SRL models rely on powerful encoders (e.g., LSTMs) and do not model non-local interaction…
External link:
http://arxiv.org/abs/1910.03136
Published in:
In Journal of Hydrology: Regional Studies June 2023 47
Multi-criteria Chinese word segmentation is a promising but challenging task, which exploits several different segmentation criteria and mines their common underlying knowledge. In this paper, we propose a flexible multi-criteria learning for Chinese…
External link:
http://arxiv.org/abs/1812.08033
Designing shared neural architecture plays an important role in multi-task learning. The challenge is that finding an optimal sharing scheme heavily relies on the expert knowledge and is not scalable to a large number of diverse tasks. Inspired by…
External link:
http://arxiv.org/abs/1808.07658