Choosing Transfer Languages for Cross-Lingual Learning
Author: | Zirui Li, Patrick Littell, Yuyan Zhang, Junxian He, Jean Lee, Graham Neubig, Chian-Yu Chen, Mengzhou Xia, Zhisong Zhang, Xuezhe Ma, Yu-Hsiang Lin, Antonios Anastasopoulos, Shruti Rijhwani |
---|---|
Language: | English |
Year of publication: | 2019 |
Subject: |
FOS: Computer and information sciences
Computer Science - Computation and Language (cs.CL); Cross-lingual transfer; Natural language processing; Artificial intelligence |
Source: | ACL (1) |
Description: | Cross-lingual transfer, where a high-resource transfer language is used to improve the accuracy of a low-resource task language, is now an invaluable tool for improving the performance of natural language processing (NLP) on low-resource languages. However, given a particular task language, it is not clear which language to transfer from, and the standard strategy is to select languages based on ad hoc criteria, usually the intuition of the experimenter. Since a large number of features contribute to the success of cross-lingual transfer (including phylogenetic similarity, typological properties, lexical overlap, and size of available data), even the most enlightened experimenter rarely considers all these factors for the particular task at hand. In this paper, we frame the task of automatically selecting optimal transfer languages as a ranking problem, and build models that consider the aforementioned features to perform this prediction. In experiments on representative NLP tasks, we demonstrate that our model predicts good transfer languages much better than ad hoc baselines considering single features in isolation, and we glean insights on which features are most informative for each NLP task, which may inform future ad hoc selection even without use of our method. Code, data, and pre-trained models are available at https://github.com/neulab/langrank. Published in the Proceedings of ACL 2019. |
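The core idea in the abstract, treating transfer-language selection as scoring candidates by a combination of features, can be illustrated with a minimal sketch. This is not the paper's actual model (langrank learns feature importances with a trained ranker); the feature values, language codes, and weights below are illustrative placeholders only.

```python
# Minimal sketch of transfer-language ranking (illustrative, not the paper's model).
# Each candidate transfer language is described by a few normalized features;
# a weighted sum stands in for the learned ranking model.

def rank_transfer_languages(candidates, weights):
    """Score each candidate transfer language as a weighted sum of its
    features and return the candidates sorted best-first."""
    def score(feats):
        return sum(weights[name] * value for name, value in feats.items())
    return sorted(candidates, key=lambda c: score(c["features"]), reverse=True)

# Hypothetical candidates for some low-resource task language.
candidates = [
    {"lang": "tur", "features": {"lexical_overlap": 0.30,
                                 "typological_sim": 0.80,
                                 "data_size": 0.60}},
    {"lang": "rus", "features": {"lexical_overlap": 0.10,
                                 "typological_sim": 0.40,
                                 "data_size": 0.90}},
    {"lang": "aze", "features": {"lexical_overlap": 0.70,
                                 "typological_sim": 0.90,
                                 "data_size": 0.20}},
]

# Illustrative weights; in the paper such importances are learned from data.
weights = {"lexical_overlap": 0.5, "typological_sim": 0.3, "data_size": 0.2}

ranking = rank_transfer_languages(candidates, weights)
print([c["lang"] for c in ranking])  # best transfer candidates first
```

The point of the sketch is the interface: given per-language features, the model outputs a ranking rather than a single hard choice, so an experimenter can try the top few candidates.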
Database: | OpenAIRE |
External link: |