Showing 1 - 4 of 4 for search: '"Mia Xu Chen"'
In this paper, we offer a preliminary investigation into the task of in-image machine translation: transforming an image containing text in one language into an image containing the same text in another language. We propose an end-to-end neural model…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2444f0aaf25c00d8b75130cc82691bfd
Author:
Ankur Bapna, Yuan Cao, Aditya Siddhant, Mia Xu Chen, Sneha Kudugunta, Orhan Firat, Naveen Arivazhagan, Yonghui Wu
Published in:
ACL
Over the last few years, two promising research directions in low-resource neural machine translation (NMT) have emerged. The first focuses on utilizing high-resource languages to improve the quality of low-resource languages via multilingual NMT. The…
Published in:
EMNLP
While current state-of-the-art NMT models, such as RNN seq2seq and Transformers, possess a large number of parameters, they are still shallow in comparison to convolutional models used for both text and vision applications. In this work we attempt to…
External link:
https://explore.openaire.eu/search/publication?articleId=doi_dedup___::ff57b462bb875a78785f38d298211fed
Author:
Ashish Vaswani, Zhifeng Chen, Mia Xu Chen, Noam Shazeer, Llion Jones, Jakob Uszkoreit, Ankur Bapna, Mike Schuster, Macduff Hughes, Yonghui Wu, George Foster, Melvin Johnson, Niki Parmar, Lukasz Kaiser, Orhan Firat, Wolfgang Macherey
Published in:
ACL (1)
The past year has witnessed rapid advances in sequence-to-sequence (seq2seq) modeling for Machine Translation (MT). The classic RNN-based approaches to MT were first out-performed by the convolutional seq2seq model, which was then out-performed by the…