Neural vs. Phrase-Based Machine Translation in a Multi-Domain Scenario

Authors: Matteo Negri, Marcello Federico, Nicola Bertoldi, Marco Turchi, M. Amin Farajian
Year of publication: 2017
Source: EACL (2)
DOI: 10.18653/v1/e17-2045
Description: State-of-the-art neural machine translation (NMT) systems are generally trained on specific domains by carefully selecting the training sets and applying proper domain adaptation techniques. In this paper, we consider the real-world scenario in which the target domain is not predefined, so the system must be able to translate text from multiple domains. We compare the performance of a generic NMT system and a phrase-based statistical machine translation (PBMT) system by training them on a generic parallel corpus composed of data from different domains. Our results on multi-domain English-French data show that, under these realistic conditions, PBMT outperforms its neural counterpart. This raises the question: is NMT ready for deployment as a generic, multi-purpose MT backbone in real-world settings?
Database: OpenAIRE