Transfer Learning for Classifying Spanish and English Text by Clinical Specialties

Authors: Stefan Schulz, Alexandra Pomares-Quimbaya, Pilar López-Úbeda
Year of publication: 2021
Subject:
Source: MIE
Description: Transfer learning has demonstrated its potential in natural language processing tasks, in which models are pre-trained on large corpora and then fine-tuned for specific tasks. We applied pre-trained language models to a Spanish biomedical document classification task. The main goal is to analyze the performance of text classification by clinical specialty using state-of-the-art language models for Spanish, and to compare the results with those obtained using corresponding models in English and with the leading pre-trained model for the biomedical domain. The outcomes offer interesting perspectives on the performance of language models pre-trained for a particular domain. In particular, we found that BioBERT, applied to Spanish texts translated into English, achieved better results than the general-domain Spanish model and the state-of-the-art multilingual model.
Database: OpenAIRE
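
A minimal sketch of the fine-tuning setup described in the abstract, using the Hugging Face transformers library. The checkpoint name, the specialty label set, and the toy training texts are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: fine-tune a pre-trained transformer for clinical-specialty
# classification. Checkpoint, labels, and example texts are assumptions.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "dmis-lab/biobert-base-cased-v1.1"        # assumed BioBERT checkpoint
SPECIALTIES = ["cardiology", "oncology", "neurology"]  # illustrative label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(SPECIALTIES)
)

# Toy training examples (English translations of clinical notes, mirroring the
# translated-text setting mentioned in the abstract).
texts = [
    "Patient presents with atrial fibrillation and dyspnea on exertion.",
    "Follow-up visit after chemotherapy for breast carcinoma.",
]
labels = torch.tensor([0, 1])  # indices into SPECIALTIES

# Tokenize the batch and run one fine-tuning step over the encoder and the
# newly initialized classification head.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
outputs = model(**batch, labels=labels)  # cross-entropy loss over specialties
outputs.loss.backward()
optimizer.step()
```

In practice such fine-tuning would iterate over a labeled corpus for several epochs; the single step above only illustrates the transfer-learning pattern of reusing pre-trained encoder weights and training a task-specific classification head.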