ZeroBERTo: Leveraging Zero-Shot Text Classification by topic modeling
Author: | Alexandre Alcoforado, Thomas Palmeira Ferraz, Rodrigo Gerber, Enzo Bustos, André Seidel Oliveira, Bruno Miguel Veloso, Fabio Levy Siqueira, Anna Helena Reali Costa |
---|---|
Language: | English |
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences
Computer Science - Machine Learning; Artificial intelligence; Computer Science - Computation and Language; Artificial Intelligence (cs.AI); Computer Science - Artificial Intelligence; Learning paradigms; Natural language processing; Machine learning; Computation and Language (cs.CL); Supervised learning; Supervised learning by classification; Machine Learning (cs.LG) |
Source: | Lecture Notes in Computer Science ISBN: 9783030983048 |
Description: | Traditional text classification approaches often require a substantial amount of labeled data, which is difficult to obtain, especially in restricted domains or less widespread languages. This lack of labeled data has led to the rise of low-resource methods, which assume low data availability in natural language processing. Among them, zero-shot learning stands out: it consists of learning a classifier without any previously labeled data. The best results reported with this approach use language models such as Transformers, but they suffer from two problems: high execution time and inability to handle long texts as input. This paper proposes a new model, ZeroBERTo, which leverages an unsupervised clustering step to obtain a compressed data representation before the classification task. We show that ZeroBERTo performs better on long inputs and has shorter execution time, outperforming XLM-R by about 12% in F1 score on the FolhaUOL dataset. Keywords: Low-Resource NLP, Unlabeled data, Zero-Shot Learning, Topic Modeling, Transformers. Comment: Accepted at PROPOR 2022: 15th International Conference on Computational Processing of Portuguese |
Database: | OpenAIRE |
External link: |
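The description above outlines the core idea: cluster unlabeled documents into topics first, then match each topic (rather than each individual document) to a candidate label. The following is a minimal, self-contained sketch of that pipeline, not the authors' implementation: ZeroBERTo uses Transformer embeddings, while this toy version substitutes bag-of-words vectors and a naive k-means so it runs without external dependencies; all document and label texts are illustrative.

```python
# Hedged sketch of a cluster-then-label zero-shot pipeline (toy version of
# the idea in the abstract; not the ZeroBERTo implementation).
from collections import Counter
import math

def bow(text, vocab):
    """Toy stand-in for a Transformer embedding: a bag-of-words vector."""
    counts = Counter(text.lower().split())
    return [counts[w] for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def kmeans(vecs, k, iters=10):
    """Naive k-means with cosine similarity; first k vectors as init."""
    centroids = [list(v) for v in vecs[:k]]
    assign = [0] * len(vecs)
    for _ in range(iters):
        assign = [max(range(k), key=lambda i: cosine(v, centroids[i]))
                  for v in vecs]
        for i in range(k):
            members = [v for v, a in zip(vecs, assign) if a == i]
            if members:  # keep old centroid if the cluster emptied
                centroids[i] = [sum(col) / len(members)
                                for col in zip(*members)]
    return assign, centroids

def zero_shot_classify(docs, labels):
    """Cluster docs into topics, then match each topic to a label."""
    vocab = sorted({w for t in docs + labels for w in t.lower().split()})
    doc_vecs = [bow(d, vocab) for d in docs]
    label_vecs = [bow(l, vocab) for l in labels]
    assign, centroids = kmeans(doc_vecs, k=len(labels))
    # Key step from the abstract: labels are matched against the compressed
    # topic representation (the centroid), not each document individually.
    topic_label = [max(range(len(labels)),
                       key=lambda j: cosine(c, label_vecs[j]))
                   for c in centroids]
    return [labels[topic_label[a]] for a in assign]
```

In the full model the label-matching step uses a Transformer-based zero-shot classifier on the topic representation, which is where the reported speed-up on long inputs comes from: the expensive model sees a handful of compressed topics instead of every long document.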