Author:
Vidhu Mathur, Tanvi Dadu, Swati Aggarwal
Language:
English
Year of publication:
2024
Subject:
Source:
Applied Sciences, Vol 14, Iss 13, p 5440 (2024)
Document type:
article
ISSN:
2076-3417
DOI:
10.3390/app14135440
Description:
Cross-lingual transfer learning using multilingual models has shown promise for improving performance on natural language processing tasks with limited training data. However, translation can introduce superficial patterns that negatively impact model generalization. This paper evaluates two state-of-the-art multilingual models, the Cross-Lingual Model based on the Robustly Optimized BERT Pretraining Approach (XLM-RoBERTa) and the Multilingual Bi-directional Auto-Regressive Transformer (mBART), on the Cross-lingual Natural Language Inference (XNLI) task using both original and machine-translated evaluation sets. Our analysis demonstrates that translation can facilitate cross-lingual transfer learning, but maintaining linguistic patterns is critical. The results provide insights into the strengths and limitations of state-of-the-art multilingual natural language processing architectures for cross-lingual understanding.
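To make the described evaluation setup concrete, the following is a minimal illustrative sketch, not taken from the paper, of how a multilingual encoder fine-tuned on NLI can be scored on an XNLI evaluation set with the Hugging Face libraries. The checkpoint name, dataset identifier, language, and sample size are assumptions chosen for demonstration and do not reflect the authors' exact configuration.

# Illustrative sketch only: zero-shot XNLI evaluation of an assumed
# XNLI-fine-tuned XLM-RoBERTa checkpoint on French validation examples.
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "joeddav/xlm-roberta-large-xnli"  # assumed public XNLI-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

# XNLI provides premise/hypothesis pairs in 15 languages; French is used here.
xnli_fr = load_dataset("facebook/xnli", "fr", split="validation[:100]")
gold_names = xnli_fr.features["label"].names  # entailment / neutral / contradiction

correct = 0
for example in xnli_fr:
    inputs = tokenizer(example["premise"], example["hypothesis"],
                       return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Map the predicted index through the model's own label names so the
    # comparison does not depend on the checkpoint's label ordering.
    pred_name = model.config.id2label[logits.argmax(dim=-1).item()].lower()
    correct += int(pred_name == gold_names[example["label"]])

print(f"Accuracy on {len(xnli_fr)} French XNLI validation examples: {correct / len(xnli_fr):.2%}")

The same loop can be pointed at an original-language split or a machine-translated one to compare accuracy across the two conditions, which is the kind of contrast the abstract describes.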
Database:
Directory of Open Access Journals
External link: