Sequential Learning of Convolutional Features for Effective Text Classification

Authors: Vijjini Anvesh Rao, Avinash Madasu
Year of publication: 2019
Source: EMNLP/IJCNLP (1)
DOI: 10.18653/v1/d19-1567
Description: Text classification has been one of the major problems in natural language processing. With the advent of deep learning, the convolutional neural network (CNN) has become a popular solution to this task. However, CNNs, which were first proposed for images, face several crucial challenges in the context of text processing, namely in their elementary building blocks: convolution filters and max pooling. These challenges have largely been overlooked by most existing CNN models proposed for text classification. In this paper, we present an experimental study of the fundamental blocks of CNNs in text categorization. Based on this critique, we propose the Sequential Convolutional Attentive Recurrent Network (SCARN). The proposed SCARN model exploits the advantages of both recurrent and convolutional structures more efficiently than previously proposed recurrent convolutional models. We test our model on text classification datasets spanning tasks such as sentiment analysis and question classification. Extensive experiments establish that SCARN outperforms other recurrent convolutional architectures with significantly fewer parameters. Furthermore, SCARN achieves better performance than various equally large deep CNN and LSTM architectures.
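The abstract describes combining convolutional feature extraction with a recurrent, attentive reading of those features, but it does not specify SCARN's exact layer arrangement. The following is a minimal sketch of one plausible recurrent convolutional attentive setup under that reading; the class name, layer sizes, and the additive attention form are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch only: module names, hyperparameters, and the attention
# mechanism below are assumptions for illustration, not the SCARN architecture
# as published.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentConvAttentiveSketch(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, n_filters=64,
                 kernel_size=3, hidden_dim=64, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Convolution extracts local n-gram features over the token sequence.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        # A recurrent layer reads the convolutional feature sequence in order,
        # rather than collapsing it with max pooling.
        self.rnn = nn.LSTM(n_filters, hidden_dim, batch_first=True)
        # Simple additive attention over the recurrent states.
        self.attn = nn.Linear(hidden_dim, 1)
        self.fc = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens):                     # tokens: (batch, seq_len)
        x = self.embed(tokens)                     # (batch, seq_len, emb_dim)
        x = F.relu(self.conv(x.transpose(1, 2)))   # (batch, n_filters, seq_len)
        states, _ = self.rnn(x.transpose(1, 2))    # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn(states).squeeze(-1), dim=1)
        context = (weights.unsqueeze(-1) * states).sum(dim=1)
        return self.fc(context)                    # (batch, n_classes)

if __name__ == "__main__":
    model = RecurrentConvAttentiveSketch()
    dummy = torch.randint(0, 10000, (4, 20))       # 4 sequences of length 20
    print(model(dummy).shape)                      # torch.Size([4, 2])
```

The design choice this sketch illustrates is the one the abstract argues for: the sequence of convolutional features is consumed sequentially by a recurrent layer with attention instead of being reduced by max pooling.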
Accepted as a Long Paper at EMNLP-IJCNLP 2019, Hong Kong, China
Database: OpenAIRE