XeroAlign: Zero-shot cross-lingual transformer alignment
| Author | Ignacio Iacobacci, Milan Gritta |
|---|---|
| Year of publication | 2021 |
| Subject | FOS: Computer and information sciences; Computer Science - Computation and Language (cs.CL); Natural language understanding; Natural language processing; Language model; Transformer (machine learning model); Artificial intelligence |
| Source | ACL/IJCNLP (Findings) |
| DOI | 10.18653/v1/2021.findings-acl.32 |
| Description | The introduction of pretrained cross-lingual language models brought decisive improvements to multilingual NLP tasks. However, the lack of labelled task data necessitates a variety of methods aiming to close the gap to high-resource languages. Zero-shot methods, in particular, often use translated task data as a training signal to bridge the performance gap between the source and target language(s). We introduce XeroAlign, a simple method for task-specific alignment of cross-lingual pretrained transformers such as XLM-R. XeroAlign uses translated task data to encourage the model to generate similar sentence embeddings for different languages (a minimal code sketch of this idea follows the record). The XeroAligned XLM-R, called XLM-RA, shows strong improvements over the baseline models, achieving state-of-the-art zero-shot results on three multilingual natural language understanding tasks. XLM-RA's text classification accuracy exceeds that of XLM-R trained with labelled data, and the model performs on par with state-of-the-art models on a cross-lingual adversarial paraphrasing task. Comment: Findings of ACL 2021 - Code: https://github.com/huawei-noah/noah-research/tree/master/xero_align |
| Database | OpenAIRE |
| External link | |
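The alignment objective described in the abstract can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' released code (see the repository linked above for that): it assumes XLM-R loaded via the Hugging Face `transformers` library, first-token pooling to obtain sentence embeddings, and an MSE alignment loss between a source utterance and its translation, which would be added to the supervised task loss during fine-tuning.

```python
# Minimal sketch of task-specific cross-lingual alignment as described in the
# abstract. Assumptions (not taken from the paper's code): Hugging Face
# transformers, first-token (<s>) pooling, and an MSE loss between the
# embeddings of parallel sentences.
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
encoder = AutoModel.from_pretrained("xlm-roberta-base")

def sentence_embedding(texts):
    """Encode a batch of sentences into one vector each via first-token pooling."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state  # (batch, seq_len, hidden_dim)
    return hidden[:, 0]                          # pooled <s> representation

def alignment_loss(source_texts, translated_texts):
    """Pull the embeddings of each sentence and its translation together."""
    return F.mse_loss(sentence_embedding(source_texts),
                      sentence_embedding(translated_texts))

# During fine-tuning this term would be added to the supervised task loss,
# e.g. total_loss = task_loss + alignment_loss(en_batch, translated_batch).
loss = alignment_loss(["set an alarm for 7 am"], ["pon una alarma a las 7 am"])
loss.backward()  # gradients flow into the shared encoder
```

What makes the alignment task-specific, per the abstract, is that the parallel signal comes from translations of the task data itself rather than from a generic parallel corpus, so the encoder is aligned on exactly the utterances the downstream classifier will see.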