SALNet: Semi-supervised Few-Shot Text Classification with Attention-based Lexicon Construction
| Field | Value |
|---|---|
| Author | Ju-Hyoung Lee, Sang-Ki Ko, Yo-Sub Han |
| Year of Publication | 2021 |
| Source | Proceedings of the AAAI Conference on Artificial Intelligence, 35:13189-13197 |
| ISSN | 2374-3468, 2159-5399 |
| Database | OpenAIRE |

Description: We propose a semi-supervised bootstrap learning framework for few-shot text classification. Starting from a small initial labeled dataset, the framework obtains a larger set of reliable training data by exploiting the attention weights of a trained LSTM-based classifier. We first train an attention-based LSTM text classifier on the given labeled dataset. We then collect, for each class, a set of words called a lexicon: a representative set of words for that class, selected according to the attention weights computed during classification. We bootstrap the classifier with new data that are labeled jointly by the classifier and the constructed lexicons, improving prediction accuracy. As a result, our approach outperforms previous state-of-the-art methods, including semi-supervised learning algorithms and pretraining algorithms, on the few-shot text classification task across four publicly available benchmark datasets. Moreover, we empirically confirm that the constructed lexicons are reliable and substantially improve the performance of the original classifier.
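The two core steps of the described framework — building a per-class lexicon from attention weights, and pseudo-labeling new data by combining classifier confidence with lexicon agreement — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`build_lexicons`, `pseudo_label`), the input format (precomputed token-level attention weights), and the confidence threshold are all assumptions; the paper's exact combination rule and lexicon-selection criterion may differ.

```python
from collections import defaultdict

def build_lexicons(examples, top_k=3):
    """Build a per-class lexicon from attention weights.

    examples: list of (tokens, attention_weights, class_label) triples,
    where attention_weights[i] is the trained classifier's attention
    score on tokens[i]. Returns {class_label: top_k words ranked by
    accumulated attention}. (Illustrative scheme, not the paper's exact rule.)
    """
    scores = defaultdict(lambda: defaultdict(float))
    for tokens, weights, label in examples:
        for tok, w in zip(tokens, weights):
            scores[label][tok] += w  # accumulate attention mass per word
    return {
        label: [w for w, _ in sorted(word_scores.items(),
                                     key=lambda kv: -kv[1])[:top_k]]
        for label, word_scores in scores.items()
    }

def pseudo_label(tokens, clf_probs, lexicons, prob_threshold=0.9):
    """Accept a pseudo-label only when the classifier is confident AND
    the text contains at least one lexicon word for the predicted class;
    otherwise leave the example unlabeled (threshold is a made-up value).
    """
    label = max(clf_probs, key=clf_probs.get)
    if clf_probs[label] >= prob_threshold and \
            any(t in lexicons.get(label, []) for t in tokens):
        return label
    return None  # not reliable enough to add to the training set
```

In the bootstrap loop, examples accepted by `pseudo_label` would be appended to the training set and the classifier retrained, so each round both the lexicons and the classifier are refined on a larger labeled pool.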