Author:
David De-Fitero-Dominguez, Eva Garcia-Lopez, Antonio Garcia-Cabot, Jesus-Angel Del-Hoyo-Gabaldon, Antonio Moreno-Cediel
Language:
English
Year of publication:
2024
Subject:
Source:
IEEE Access, Vol. 12, pp. 25580-25589 (2024)
Document type:
article
ISSN:
2169-3536
DOI:
10.1109/ACCESS.2024.3361673
Description:
In recent years, Transformer language models have made a significant impact on automatic text generation. This study focuses on the task of distractor generation in Spanish using a fine-tuned multilingual text-to-text model, namely mT5. Our method outperformed established baselines based on LSTM networks, confirming the effectiveness of Transformer architectures in such NLP tasks. While comparisons with other Transformer-based solutions yielded diverse outcomes depending on the metric of choice, our method notably achieved superior results on the ROUGE metric compared to the GPT-2 approach. Although traditional evaluation metrics such as BLEU and ROUGE are commonly used, this paper argues for more context-sensitive metrics, given the inherent variability of acceptable distractor outputs. Among the contributions of this research are a comprehensive comparison with other methods, an examination of the potential drawbacks of multilingual models, and the introduction of alternative evaluation metrics. Future research directions, derived from our findings and a review of related work, are also suggested, with a particular emphasis on leveraging other language models and Transformer architectures.
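To illustrate the kind of pipeline the abstract describes, the sketch below generates Spanish distractor candidates with an mT5 model through the Hugging Face transformers library. This is not the authors' released code: the checkpoint name, the input serialization (question, correct answer, context), and the decoding settings are assumptions for illustration; the paper's fine-tuned model and preprocessing may differ.

```python
# Minimal sketch of Spanish distractor generation with mT5 (assumptions noted inline).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Placeholder checkpoint: in practice a model fine-tuned for distractor
# generation would be loaded here; the base mT5 alone is not task-ready.
MODEL_NAME = "google/mt5-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def generate_distractors(question: str, answer: str, context: str, n: int = 3) -> list[str]:
    """Return n candidate distractors for a multiple-choice question."""
    # Hypothetical input format; the paper's actual serialization is model-specific.
    prompt = f"pregunta: {question} respuesta: {answer} contexto: {context}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)
    outputs = model.generate(
        **inputs,
        max_new_tokens=32,
        num_beams=max(n, 4),      # beam search so several candidates can be returned
        num_return_sequences=n,
        early_stopping=True,
    )
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

distractors = generate_distractors(
    question="¿Cuál es la capital de España?",
    answer="Madrid",
    context="Madrid es la capital y la ciudad más poblada de España.",
)
print(distractors)
```

In an evaluation setup such as the one the abstract mentions, candidates like these would be compared against reference distractors with metrics such as BLEU and ROUGE, which is precisely where the paper argues that more context-sensitive measures are needed, since several different distractors can be equally acceptable.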
Database:
Directory of Open Access Journals
External link: