USTCCTSU at SemEval-2024 Task 1: Reducing Anisotropy for Cross-lingual Semantic Textual Relatedness Task
Author: | Li, Jianjian; Liang, Shengwei; Liao, Yong; Deng, Hongping; Yu, Haiyang |
---|---|
Publication Year: | 2024 |
Source: | In Proceedings of the 18th International Workshop on Semantic Evaluation (SemEval-2024), pages 881-887 |
Document Type: | Working Paper |
DOI: | 10.18653/v1/2024.semeval-1.126 |
Description: | The cross-lingual semantic textual relatedness task is an important research problem that addresses challenges in cross-lingual communication and text understanding. It helps establish semantic connections between different languages, which is crucial for downstream tasks such as machine translation, multilingual information retrieval, and cross-lingual text understanding. Based on extensive comparative experiments, we choose XLM-R-base as our base model and apply whitening to the pre-trained sentence representations to reduce anisotropy (see the sketch after this record). Additionally, for the given training data, we design a careful data filtering method to alleviate the curse of multilingualism. With our approach, we achieve second place in Spanish, third place in Indonesian, and multiple top-ten results in the competition's Track C. We further conduct a comprehensive analysis to inspire future research aimed at improving performance on cross-lingual tasks. Comment: 8 pages, 3 figures |
Database: | arXiv |
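The whitening step referenced in the description maps pre-trained sentence embeddings to approximately zero mean and identity covariance, which reduces the anisotropy of the embedding space. The record does not include the authors' code; the NumPy sketch below shows the standard whitening transform under the assumption that the inputs are pooled XLM-R-base sentence vectors, and the function names and the optional dimensionality-reduction parameter `k` are illustrative only, not the paper's implementation.

```python
import numpy as np

def compute_whitening(embeddings, k=None):
    """Estimate a whitening transform (mean mu, matrix W) from sentence embeddings.

    embeddings: array of shape (n_sentences, dim), e.g. pooled XLM-R-base outputs.
    k: optionally keep only the first k whitened dimensions (dimensionality reduction).
    """
    mu = embeddings.mean(axis=0, keepdims=True)    # (1, dim) mean vector
    cov = np.cov((embeddings - mu).T)              # (dim, dim) covariance matrix
    u, s, _ = np.linalg.svd(cov)                   # SVD of the symmetric covariance
    w = u @ np.diag(1.0 / np.sqrt(s + 1e-8))       # W = U * Lambda^{-1/2}, eps for stability
    if k is not None:
        w = w[:, :k]                               # keep the leading k whitened directions
    return mu, w

def whiten(embeddings, mu, w):
    """Apply the transform x_tilde = (x - mu) W, giving near-isotropic vectors."""
    return (embeddings - mu) @ w

# Illustrative usage on random stand-in vectors (real inputs would be XLM-R sentence embeddings).
vecs = np.random.randn(1000, 768).astype(np.float32)
mu, w = compute_whitening(vecs, k=256)
whitened = whiten(vecs, mu, w)                     # shape (1000, 256)
```

After whitening, cosine or dot-product similarity between sentence vectors is computed in the transformed space, which is the usual way such representations are scored for semantic relatedness.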