Domain Adaptation for Tibetan-Chinese Neural Machine Translation
Author: | Rangjia Cai, Maoxian Zhou, Jia Secha |
Year of publication: | 2020 |
Subject: |
Domain adaptation, Machine translation, Neural machine translation, Transfer learning, Computer science, Artificial intelligence, Natural language processing |
Source: | 2020 3rd International Conference on Algorithms, Computing and Artificial Intelligence. |
DOI: | 10.1145/3446132.3446404 |
Description: | The meaning of a word or sentence is likely to change across semantic contexts, which challenges a general-purpose translation system to maintain stable performance across domains. Domain adaptation is therefore an essential research topic in Neural Machine Translation practice. To efficiently train translation models for different domains, in this work we take a general Tibetan-Chinese translation model as the parent model and obtain two domain-specific Tibetan-Chinese translation models using small-scale in-domain data. The empirical results indicate that the method is a promising approach to domain adaptation in low-resource scenarios, yielding better BLEU scores as well as faster training than our general baseline models. |
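The parent-child transfer scheme described above (train a general-domain parent model, then continue training on small in-domain data) can be sketched with a toy stand-in model. This is a minimal illustration, not the paper's actual NMT setup: a one-parameter linear regressor plays the role of the translation model, and the data, learning rate, and step counts are all hypothetical.

```python
import random

def train(w, data, steps, lr=0.1):
    """Plain SGD on squared error for a 1-parameter model y = w * x."""
    for _ in range(steps):
        x, y = random.choice(data)
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

def loss(w, data):
    """Mean squared error over a dataset of (x, y) pairs."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

random.seed(0)

# Hypothetical data: the "general domain" follows y = 2.0 * x,
# while the "target domain" shifts slightly to y = 2.5 * x.
general = [(k / 10, 2.0 * k / 10) for k in range(1, 21)]   # large general set
in_domain = [(k / 10, 2.5 * k / 10) for k in range(1, 6)]  # small in-domain set

parent = train(0.0, general, steps=200)      # parent (general-domain) model
child = train(parent, in_domain, steps=50)   # fine-tune parent on in-domain data
scratch = train(0.0, in_domain, steps=50)    # same budget, random (zero) init

print("child loss:", loss(child, in_domain))
print("scratch loss:", loss(scratch, in_domain))
```

With the same small in-domain budget, the child model initialized from the parent ends up closer to the target-domain solution than the model trained from scratch, which mirrors the paper's motivation for starting from a general parent model in low-resource settings.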
Database: | OpenAIRE |
External link: |