From Random to Informed Data Selection: A Diversity-Based Approach to Optimize Human Annotation and Few-Shot Learning

Authors: Alcoforado, Alexandre; Ferraz, Thomas Palmeira; Okamura, Lucas Hideki; Fama, Israel Campos; Lavado, Arnold Moya; Bueno, Bárbara Dias; Veloso, Bruno; Costa, Anna Helena Reali
Publication year: 2024
Subject:
Document type: Working Paper
Description: A major challenge in Natural Language Processing is obtaining annotated data for supervised learning. One option is to use crowdsourcing platforms for data annotation. However, crowdsourcing introduces issues related to annotators' experience, consistency, and biases. An alternative is to use zero-shot methods, which in turn have limitations compared to their few-shot or fully supervised counterparts. Recent advancements driven by large language models show potential, but they struggle to adapt to specialized domains with severely limited data. The most common approach therefore remains having humans annotate a randomly sampled set of data points to build an initial dataset. However, random sampling is often inefficient, as it ignores the characteristics of the data and the specific needs of the model. The situation worsens with imbalanced datasets, where random sampling is heavily biased towards the majority classes and leads to excessive annotation of majority-class examples. To address these issues, this paper contributes an automatic and informed data selection architecture to build a small dataset for few-shot learning. Our proposal minimizes the quantity and maximizes the diversity of data selected for human annotation, while improving model performance.
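To make the idea of diversity-based selection concrete, the sketch below illustrates one generic way to pick a small, diverse annotation batch from an unlabeled pool: embed the texts, cluster them, and send one representative per cluster to human annotators. This is a minimal illustration under assumed choices (TF-IDF features and k-means), not the architecture proposed in the paper.

```python
# Minimal sketch of diversity-based selection of texts for human annotation.
# Assumptions (not from the paper): TF-IDF features, k-means clustering,
# one representative per cluster chosen as the point nearest its centroid.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def select_diverse(texts, budget):
    """Return indices of `budget` texts covering distinct regions of the feature space."""
    features = TfidfVectorizer().fit_transform(texts)
    km = KMeans(n_clusters=budget, n_init=10, random_state=0).fit(features)
    selected = []
    for c in range(budget):
        members = np.where(km.labels_ == c)[0]
        if len(members) == 0:
            continue
        # Pick the cluster member closest to the centroid as its representative.
        dists = np.linalg.norm(features[members].toarray() - km.cluster_centers_[c], axis=1)
        selected.append(int(members[np.argmin(dists)]))
    return selected

unlabeled = [
    "refund my order please",
    "where is my package",
    "cancel the subscription",
    "the app crashes on login",
    "login fails with error 500",
    "request a refund for a duplicate charge",
]
print(select_diverse(unlabeled, budget=3))  # indices to send to human annotators
```

Compared with random sampling, picking one representative per cluster avoids spending the annotation budget on many near-duplicate majority-class examples.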
Comment: Accepted at PROPOR 2024 - The 16th International Conference on Computational Processing of Portuguese
Database: arXiv