Evaluating resource-lean cross-lingual embedding models in unsupervised retrieval

Authors: Robert Litschko, Goran Glavaš, Laura Dietz, Ivan Vulić
Year of publication: 2019
Subject:
Source: Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '19)
DOI: 10.17863/cam.45164
Description: Cross-lingual embeddings (CLEs) facilitate cross-lingual natural language processing and information retrieval. Recently, a wide variety of resource-lean projection-based models for inducing CLEs, requiring limited or no bilingual supervision, has been introduced. Despite their potential usefulness in downstream IR and NLP tasks, these CLE models have almost exclusively been evaluated on word translation tasks. In this work, we provide a comprehensive comparative evaluation of projection-based CLE models for both sentence-level and document-level cross-lingual information retrieval (CLIR). We show that, in some settings, resource-lean CLE-based CLIR models can outperform resource-intensive models that rely on full-blown machine translation (MT). We hope our work serves as a guideline for CLIR practitioners choosing the right model for their setting.
Database: OpenAIRE
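
To illustrate the kind of unsupervised CLIR setup the abstract refers to, the following minimal Python sketch represents an English query and German documents as averaged word vectors in a shared cross-lingual space and ranks documents by cosine similarity. The toy vectors, vocabulary, and averaging aggregation are illustrative assumptions, not the exact models evaluated in the paper.

    import numpy as np

    # Toy shared cross-lingual embedding space; a real system would load
    # projection-based CLEs (these vectors are illustrative placeholders).
    cle = {
        "dog":  np.array([0.90, 0.10, 0.00]),   # English
        "hund": np.array([0.88, 0.12, 0.00]),   # German translation, nearby in the space
        "haus": np.array([0.10, 0.90, 0.05]),   # German "house"
        "auto": np.array([0.05, 0.10, 0.95]),   # German "car"
    }

    def embed(tokens):
        # Average the available word vectors; skip out-of-vocabulary tokens.
        vecs = [cle[t] for t in tokens if t in cle]
        return np.mean(vecs, axis=0) if vecs else np.zeros(3)

    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    query = ["dog"]                                   # English query
    docs = {"d1": ["hund"], "d2": ["haus", "auto"]}   # German document collection

    q_vec = embed(query)
    ranking = sorted(docs, key=lambda d: cosine(q_vec, embed(docs[d])), reverse=True)
    print(ranking)  # ['d1', 'd2'] -- the translation-equivalent document ranks first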