Showing 1 - 10 of 40 results for search: '"Naveen Arivazhagan"'
Published in:
Interspeech 2021.
Document-level neural machine translation (DocNMT) achieves coherent translations by incorporating cross-sentence context. However, for most language pairs there is a shortage of parallel documents, although parallel sentences are readily available. …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::4969ea8b265a7546b3c3dc08a3d513ad
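The record above concerns training document-level NMT when parallel documents are scarce but parallel sentences are plentiful. One common way to expose cross-sentence context to an NMT model is to concatenate a few preceding source sentences onto each training example with a separator token. The sketch below illustrates only that generic style of data preparation; the separator token, function name, and context window size are illustrative assumptions, not details taken from the paper.

```python
# Illustrative only: build concatenation-style DocNMT training examples by
# prepending the previous k source sentences as context. The <sep> token and
# window size are assumptions, not details from the paper above.
SEP = "<sep>"

def make_doc_examples(src_doc, tgt_doc, context_size=2):
    """src_doc/tgt_doc: lists of aligned sentences from one parallel document."""
    examples = []
    for i, (src, tgt) in enumerate(zip(src_doc, tgt_doc)):
        context = src_doc[max(0, i - context_size):i]
        augmented_src = f" {SEP} ".join(context + [src])
        examples.append((augmented_src, tgt))
    return examples

# Sentence-only corpora (no document boundaries) can still be used:
# each sentence simply gets an empty context window.
print(make_doc_examples(["A first sentence.", "A second one.", "A third."],
                        ["Une première phrase.", "Une deuxième.", "Une troisième."]))
```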
Published in:
IWSLT
There has been great progress in improving streaming machine translation, a simultaneous paradigm where the system appends to a growing hypothesis as more source content becomes available. We study a related problem in which revisions to the hypothesis …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::39d685370375aa9208e6dc5a19f43119
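The record above contrasts append-only streaming translation with a setting in which earlier output may be revised (re-translation). As a rough illustration of the difference, the sketch below retranslates the full source prefix each time a new token arrives and measures how much of the previous hypothesis is erased; `translate` is a hypothetical stand-in for any MT system, not an API from the paper.

```python
# Toy contrast between append-only streaming and re-translation.
# `translate` is a hypothetical callable mapping a (possibly partial)
# source prefix to a target string.
def retranslate_stream(source_tokens, translate):
    """Re-translation: retranslate the full prefix each time a token arrives,
    so earlier output may be revised (visible as 'flicker')."""
    hypotheses = []
    for t in range(1, len(source_tokens) + 1):
        hypotheses.append(translate(source_tokens[:t]))
    return hypotheses

def erasure(prev, curr):
    """Rough instability measure: how many trailing words of the previous
    hypothesis were erased by the current one."""
    prev, curr = prev.split(), curr.split()
    common = 0
    for a, b in zip(prev, curr):
        if a != b:
            break
        common += 1
    return len(prev) - common
```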
Authors:
Ankur Bapna, Yuan Cao, Aditya Siddhant, Mia Xu Chen, Sneha Kudugunta, Orhan Firat, Naveen Arivazhagan, Yonghui Wu
Published in:
ACL
Over the last few years two promising research directions in low-resource neural machine translation (NMT) have emerged. The first focuses on utilizing high-resource languages to improve the quality of low-resource languages via multilingual NMT. The second …
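For context, multilingual NMT systems of the kind referenced above typically share one model across many language pairs and signal the desired output language with a special token prepended to the source. The snippet below shows only that generic preprocessing convention; it is not the specific recipe proposed in the paper.

```python
# Generic preprocessing for a single shared multilingual NMT model:
# prepend a target-language token so one model can translate into many
# languages. This is a common convention, not necessarily this paper's setup.
def tag_for_multilingual_nmt(src_sentence, target_lang):
    return f"<2{target_lang}> {src_sentence}"

print(tag_for_multilingual_nmt("How are you?", "fr"))   # "<2fr> How are you?"
print(tag_for_multilingual_nmt("How are you?", "sw"))   # "<2sw> How are you?"
```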
Published in:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Tutorial Abstracts.
Published in:
ICASSP
Neural Machine Translation (NMT) models have demonstrated strong state of the art performance on translation tasks where well-formed training and evaluation data are provided, but they remain sensitive to inputs that include errors of various types.
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::7c8f160dde643b10cbd529d88d83bd3c
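A common way to probe (and partly mitigate) this kind of input sensitivity is to inject synthetic noise into source text. The sketch below applies generic character-level perturbations (adjacent swaps and deletions); the specific noise operations and rates are illustrative assumptions, not the error types studied in the paper.

```python
import random

# Illustrative synthetic-noise injection for probing NMT robustness.
# Noise types (adjacent-character swap, character drop) are generic examples.
def add_char_noise(sentence, p=0.1, rng=None):
    rng = rng if rng is not None else random.Random(0)
    chars = list(sentence)
    out = []
    i = 0
    while i < len(chars):
        r = rng.random()
        if r < p / 2 and i + 1 < len(chars):   # swap adjacent characters
            out.extend([chars[i + 1], chars[i]])
            i += 2
        elif r < p:                             # drop a character
            i += 1
        else:
            out.append(chars[i])
            i += 1
    return "".join(out)

print(add_char_noise("robust machine translation"))
```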
Published in:
EMNLP/IJCNLP (1)
Multilingual Neural Machine Translation (NMT) models have yielded large empirical success in transfer learning settings. However, these black-box representations are poorly understood, and their mode of transfer remains elusive. In this work, we attempt …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::06bf1b6717501894c86c27917895fe79
http://arxiv.org/abs/1909.02197
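The record above analyzes multilingual NMT representations. A standard tool for comparing representations across languages is (SV)CCA; the NumPy sketch below computes the mean canonical correlation between two activation matrices as a generic illustration. It shows only the plain linear-CCA step, not the paper's exact analysis pipeline.

```python
import numpy as np

# Generic linear-CCA similarity between two sets of representations
# (rows = examples, columns = hidden dimensions). SVCCA-style analyses first
# reduce each matrix with an SVD; here we show only the plain CCA step.
def mean_cca_similarity(X, Y):
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    qx, _ = np.linalg.qr(X)
    qy, _ = np.linalg.qr(Y)
    # Singular values of Qx^T Qy are the canonical correlations.
    corrs = np.linalg.svd(qx.T @ qy, compute_uv=False)
    return float(corrs.mean())

# Example: two nearly linearly related sets of representations.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 64))
B = A @ rng.normal(size=(64, 64)) + 0.1 * rng.normal(size=(200, 64))
print(mean_cca_similarity(A, B))   # close to 1.0 for nearly-linearly-related reps
```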
Authors:
Chung-Cheng Chiu, Colin Raffel, Naveen Arivazhagan, Ruoming Pang, Colin Cherry, Wei Li, Semih Yavuz, Wolfgang Macherey
Published in:
ACL (1)
Simultaneous machine translation begins to translate each source sentence before the source speaker is finished speaking, with applications to live and streaming scenarios. Simultaneous systems must carefully schedule their reading of the source sentence …
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::a327f333a7c39fddce90ade95b3c7e51
http://arxiv.org/abs/1906.05218
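The paper above (arXiv:1906.05218) is about when a simultaneous system should read another source token versus write a target token. As a much simpler point of reference, the sketch below implements a fixed wait-k schedule, a common simultaneous-MT baseline that is deliberately not the learned policy from the paper; `decode_next_token` is a hypothetical decoding callback.

```python
# A fixed wait-k read/write schedule, shown only as a simple baseline policy
# for simultaneous MT; the paper above instead learns its schedule.
# `decode_next_token` is a hypothetical callback that returns the next target
# token given the source prefix read so far and the target prefix emitted so far.
def wait_k_schedule(source_tokens, decode_next_token, k=3):
    target = []
    read = 0
    while True:
        # READ until the source lead over the target reaches k tokens,
        # or the source is exhausted.
        while read < len(source_tokens) and read - len(target) < k:
            read += 1
        # WRITE one target token conditioned on the source prefix read so far.
        token = decode_next_token(source_tokens[:read], target)
        if token == "</s>":
            break
        target.append(token)
    return target
```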
Published in:
EMNLP/IJCNLP (1)
We propose a practical scheme to train a single multilingual sequence labeling model that yields state of the art results and is small and fast enough to run on a single CPU. Starting from a public multilingual BERT checkpoint, our final model is 6x …
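Distilling a large multilingual teacher into a small sequence-labeling student, as in the record above, usually involves matching the teacher's per-token label distribution. The sketch below shows a generic softened per-token distillation loss in NumPy; the temperature and exact formulation are standard choices, not necessarily the paper's objective.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Generic per-token knowledge-distillation loss for sequence labeling:
# the student matches the teacher's softened label distribution at every token.
def token_distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Both arrays: (seq_len, num_labels). Returns mean KL(teacher || student)."""
    t = temperature
    p_teacher = softmax(teacher_logits / t)
    p_student = softmax(student_logits / t)
    kl = (p_teacher * (np.log(p_teacher + 1e-9) - np.log(p_student + 1e-9))).sum(axis=-1)
    # t**2 keeps the loss scale comparable across temperatures.
    return float(kl.mean()) * t * t

rng = np.random.default_rng(0)
print(token_distillation_loss(rng.normal(size=(5, 10)), rng.normal(size=(5, 10))))
```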
Published in:
EMNLP/IJCNLP (1)
Fine-tuning pre-trained Neural Machine Translation (NMT) models is the dominant approach for adapting to new languages and domains. However, fine-tuning requires adapting and maintaining a separate model for each target task. We propose a simple yet …
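One widely used alternative to maintaining a fully fine-tuned model per task is to insert small bottleneck adapter layers into a frozen pretrained model and train only those. The sketch below shows such a layer with a residual connection as a generic illustration, not necessarily this paper's exact proposal; the dimensions, initialization, and activation are assumptions.

```python
import numpy as np

# A typical bottleneck adapter layer: project down, non-linearity, project up,
# residual connection. Only these small adapter weights would be trained per
# task while the pretrained model stays frozen. Sizes and activation are
# illustrative choices.
class Adapter:
    def __init__(self, hidden_dim, bottleneck_dim, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        self.w_down = rng.normal(scale=0.02, size=(hidden_dim, bottleneck_dim))
        self.w_up = rng.normal(scale=0.02, size=(bottleneck_dim, hidden_dim))

    def __call__(self, x):                       # x: (..., hidden_dim)
        h = np.maximum(x @ self.w_down, 0.0)     # down-projection + ReLU
        return x + h @ self.w_up                 # up-projection + residual

adapter = Adapter(hidden_dim=512, bottleneck_dim=64)
print(adapter(np.zeros((2, 512))).shape)         # (2, 512)
```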