Predicting Machine Translation Performance on Low-Resource Languages: The Role of Domain Similarity

Author: Khiu, Eric, Toossi, Hasti, Anugraha, David, Liu, Jinyu, Li, Jiaxu, Flores, Juan Armando Parra, Roman, Leandro Acros, Doğruöz, A. Seza, Lee, En-Shiun Annie
Publication Year: 2024
Subject:
Document Type: Working Paper
Description: Fine-tuning and testing a multilingual large language model is expensive and challenging for low-resource languages (LRLs). While previous studies have used machine learning methods to predict the performance of natural language processing (NLP) tasks, they primarily focus on high-resource languages, overlooking LRLs and shifts across domains. Focusing on LRLs, we investigate three factors: the size of the fine-tuning corpus, the domain similarity between the fine-tuning and testing corpora, and the language similarity between the source and target languages. We employ classical regression models to assess how these factors affect model performance. Our results indicate that domain similarity has the most critical impact on predicting the performance of Machine Translation models.
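The regression setup described above can be sketched minimally as follows. This is a hypothetical illustration only: the feature values, quality scores, and variable names are synthetic assumptions, not data or code from the paper, which uses its own feature definitions and regression models.

```python
import numpy as np

# Hypothetical illustration: fit an ordinary-least-squares regression that
# predicts an MT quality score from the three factors named in the abstract.
# All values below are synthetic, not from the paper.
# Feature columns: fine-tuning corpus size (log10 sentence pairs),
#                  domain similarity (0-1), language similarity (0-1).
X = np.array([
    [3.0, 0.9, 0.8],
    [4.0, 0.7, 0.6],
    [3.5, 0.2, 0.7],
    [4.5, 0.8, 0.3],
    [3.2, 0.5, 0.5],
    [4.2, 0.3, 0.9],
])
y = np.array([28.0, 25.0, 12.0, 26.0, 18.0, 16.0])  # e.g. BLEU scores

# Add an intercept column and solve least squares.
A = np.hstack([np.ones((X.shape[0], 1)), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, w_size, w_domain, w_lang = coefs

# Comparing coefficients on standardized features would indicate which
# factor contributes most to the predicted performance.
print(f"intercept={intercept:.2f}, size={w_size:.2f}, "
      f"domain={w_domain:.2f}, lang={w_lang:.2f}")
```

In this kind of analysis, the relative magnitude of the (standardized) coefficients serves as a simple proxy for each factor's importance.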
Comment: 13 pages, 5 figures, accepted to Findings of EACL 2024
Database: arXiv