Showing 1 - 10 of 45 for search: '"Zhu, Muhua"'
Previous approaches to the task of implicit discourse relation recognition (IDRR) generally view it as a classification task. Even with pre-trained language models like BERT and RoBERTa, IDRR still relies on complicated neural networks with multiple …
External link:
http://arxiv.org/abs/2409.13716
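Since this abstract frames IDRR as classification on top of a pre-trained language model, the following is a minimal sketch of that baseline setup using Hugging Face Transformers. The checkpoint, the PDTB-style label set, and the example argument pair are illustrative assumptions, not this paper's actual configuration.

```python
# Minimal sketch: IDRR as sentence-pair classification with RoBERTa.
# The label set and example below are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["Comparison", "Contingency", "Expansion", "Temporal"]  # PDTB top-level senses

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=len(LABELS)
)

# The two discourse arguments are encoded as a sentence pair; the implicit
# connective between them is what the classifier must infer.
arg1 = "The company's profits fell sharply last quarter."
arg2 = "It had bet heavily on a product that never shipped."
inputs = tokenizer(arg1, arg2, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
print(LABELS[logits.argmax(dim=-1).item()])  # untrained head: prediction is random
```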
Recent studies on Knowledge Base Question Answering (KBQA) have shown great progress on this task via better question understanding. Previous works for encoding questions mainly focus on the word sequences, but seldom consider the information from syntax …
External link:
http://arxiv.org/abs/2107.07940
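As a rough illustration of what "information from syntax" can mean beyond the raw word sequence, here is a sketch that attaches each question token's dependency relation and syntactic head using spaCy. This is only one simple featurization, not the encoder proposed in the paper, and it assumes the small English spaCy model is installed.

```python
# Sketch: exposing syntactic structure to a question encoder as extra
# per-token features (dependency label + head index). Illustrative only.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this model has been downloaded

question = "Who directed the film that won Best Picture in 1994?"
doc = nlp(question)

# (word, dependency relation, index of syntactic head) triples that a
# syntax-aware encoder could consume alongside the word sequence itself.
for tok in doc:
    print(f"{tok.text:>10}  {tok.dep_:<8}  head={tok.head.text}")
```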
In the literature, the research on abstract meaning representation (AMR) parsing is much restricted by the size of human-curated dataset which is critical to build an AMR parser with good performance. To alleviate such data size restriction, pre-trained …
External link:
http://arxiv.org/abs/2010.01771
Author:
Ding, Ning, Long, Dingkun, Xu, Guangwei, Zhu, Muhua, Xie, Pengjun, Wang, Xiaobin, Zheng, Hai-Tao
Fully supervised neural approaches have achieved significant progress in the task of Chinese word segmentation (CWS). Nevertheless, the performance of supervised models tends to drop dramatically when they are applied to out-of-domain data. Performance …
External link:
http://arxiv.org/abs/2007.08186
Recent studies on AMR-to-text generation often formalize the task as a sequence-to-sequence (seq2seq) learning problem by converting an Abstract Meaning Representation (AMR) graph into a word sequence. Graph structures are further modeled into the seq2seq …
External link:
http://arxiv.org/abs/1909.00136
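The conversion of an AMR graph into a word sequence that this abstract mentions can be illustrated with a toy depth-first linearization. Real systems, including presumably this paper's, handle re-entrancies, sense tags, and alignments far more carefully; the graph below is hand-written.

```python
# Sketch: depth-first linearization of a toy AMR graph into a token
# sequence suitable for a seq2seq model.

def linearize(graph, node, visited=None):
    """Depth-first traversal emitting PENMAN-like bracketed tokens."""
    if visited is None:
        visited = set()
    if node in visited:            # re-entrant node: emit a back-reference
        return [node]
    visited.add(node)
    tokens = ["(", node]
    for relation, child in graph.get(node, []):
        tokens.append(relation)
        tokens.extend(linearize(graph, child, visited))
    tokens.append(")")
    return tokens

# Toy AMR for "The boy wants to go": the boy is ARG0 of both want-01
# and go-01, so the second mention is a re-entrancy.
amr = {
    "want-01": [(":ARG0", "boy"), (":ARG1", "go-01")],
    "go-01":   [(":ARG0", "boy")],
}
print(" ".join(linearize(amr, "want-01")))
# ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 boy ) )
```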
Academic article
This result cannot be displayed to users who are not logged in. Log in to view it.
Author:
Gong, Yu, Luo, Xusheng, Zhu, Yu, Ou, Wenwu, Li, Zhao, Zhu, Muhua, Zhu, Kenny Q., Duan, Lu, Chen, Xi
Slot filling is a critical task in natural language understanding (NLU) for dialog systems. State-of-the-art approaches treat it as a sequence labeling problem and adopt such models as BiLSTM-CRF. While these models work relatively well on standard benchmark …
External link:
http://arxiv.org/abs/1803.11326
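To make the sequence-labeling framing concrete, here is a bare BiLSTM tagger over BIO slot labels. The BiLSTM-CRF models the abstract refers to add a CRF layer on top to score whole label sequences; the vocabulary, slot set, and example utterance below are toy assumptions.

```python
# Sketch: slot filling as BIO sequence labeling with a BiLSTM (no CRF).
import torch
import torch.nn as nn

WORDS = {"<pad>": 0, "play": 1, "hey": 2, "jude": 3, "by": 4, "the": 5, "beatles": 6}
TAGS = {"O": 0, "B-song": 1, "I-song": 2, "B-artist": 3, "I-artist": 4}

class BiLSTMTagger(nn.Module):
    def __init__(self, vocab, tags, emb=32, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb, padding_idx=0)
        self.lstm = nn.LSTM(emb, hidden, bidirectional=True, batch_first=True)
        self.out = nn.Linear(2 * hidden, tags)    # per-token tag scores

    def forward(self, x):
        h, _ = self.lstm(self.emb(x))
        return self.out(h)                        # (batch, seq_len, n_tags)

model = BiLSTMTagger(len(WORDS), len(TAGS))
# "play hey jude by the beatles" -> O B-song I-song O B-artist I-artist
x = torch.tensor([[1, 2, 3, 4, 5, 6]])
print(model(x).argmax(dim=-1))  # untrained: arbitrary tags; the shapes are the point
```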
Author:
Li, Junhui, Zhu, Muhua
In the past few years, attention mechanisms have become an indispensable component of end-to-end neural machine translation models. However, previous attention models always refer to some source words when predicting a target word, which contradicts …
External link:
http://arxiv.org/abs/1705.11160
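The behavior the abstract calls into question — that some source words are always consulted — follows directly from how standard attention forms its context vector. A minimal sketch of that step, with illustrative dot-product scoring and arbitrary shapes:

```python
# Sketch: a standard source-side attention step. The softmax weights sum
# to 1, so the context vector always draws on some source words.
import torch

src_len, dim = 7, 16
source_states = torch.randn(src_len, dim)   # encoder outputs, one per source word
target_state = torch.randn(dim)             # current decoder hidden state

scores = source_states @ target_state        # (src_len,) alignment scores
weights = torch.softmax(scores, dim=0)       # convex weights over source positions
context = weights @ source_states            # (dim,) weighted source summary
print(weights.sum().item())                  # 1.0 — source mass is always spent
```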
Even though a linguistics-free sequence to sequence model in neural machine translation (NMT) has certain capability of implicitly learning syntactic information of source sentences, this paper shows that source syntax can be explicitly incorporated …
External link:
http://arxiv.org/abs/1705.01020
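One common way to incorporate source syntax explicitly is to mix linearized parse labels into the source word sequence before feeding an otherwise unchanged seq2seq model; whether this matches the paper's actual integration is not clear from the snippet. A sketch over a hand-written bracketed tree:

```python
# Sketch: interleaving constituent labels with source words. Real mixed
# representations often keep closing markers too; this drops them for brevity.

def mix_syntax(tree_tokens):
    """Keep words and opening constituent labels, drop only the brackets."""
    return [t.lstrip("(") for t in tree_tokens if t not in ("(", ")")]

# "(S (NP the boy) (VP saw (NP the dog)))" pre-tokenized by hand:
tree = ["(S", "(NP", "the", "boy", ")", "(VP", "saw", "(NP", "the", "dog", ")", ")", ")"]
print(" ".join(mix_syntax(tree)))
# S NP the boy VP saw NP the dog
```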
Academic article
This result cannot be displayed to users who are not logged in. Log in to view it.