Long Short-Term Memory With Quadratic Connections in Recursive Neural Networks for Representing Compositional Semantics
Author: | Mingmin Chi, Dong Wu |
---|---|
Publication year: | 2017 |
Subject: |
General Computer Science, Computer science, Principle of compositionality, Treebank, Semantics, Semantic similarity, General Materials Science, General Engineering, Artificial neural network, Sentiment analysis, nonlinear connections, deep learning, Recurrent neural networks, long short-term memory networks, Artificial intelligence, Natural language processing, Natural language, Sentence |
Source: | IEEE Access, Vol 5, Pp 16077-16083 (2017) |
ISSN: | 2169-3536 |
DOI: | 10.1109/access.2016.2647384 |
Description: | Long short-term memory (LSTM) has been widely used in applications such as natural language processing, speech recognition, and computer vision, built on recurrent neural networks (RNNs) or recursive neural networks (RvNNs), i.e., tree-structured RNNs. In particular, the LSTM-RvNN has been used to represent compositional semantics through the connections of hidden vectors over child units. However, the linear connections in existing LSTM networks are incapable of capturing complex semantic representations of natural language texts. For example, complex structures in natural language usually denote intricate relationships between words, such as negated sentiment or sentiment strength. In this paper, quadratic connections are proposed for the LSTM model in terms of RvNNs (abbreviated as qLSTM-RvNN) in order to address the problem of representing compositional semantics. The proposed qLSTM-RvNN model is evaluated on benchmark data sets involving semantic compositionality: sentiment analysis on the Stanford Sentiment Treebank and semantic relatedness on the Sentences Involving Compositional Knowledge (SICK) data set. Empirical results show that it outperforms state-of-the-art RNN, RvNN, and LSTM networks on both semantic compositionality tasks, increasing classification accuracy and sentence correlation while significantly decreasing computational complexity. |
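To make the idea of quadratic connections concrete, the following is a minimal sketch of one binary tree-LSTM composition step in which, besides the usual linear map over the concatenated child hidden states, an element-wise quadratic interaction between the two children feeds the gates. The gate layout, the `Wq @ (h_l * h_r)` interaction term, and all parameter names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quadratic_tree_lstm_node(h_l, c_l, h_r, c_r, params):
    """One binary tree-LSTM composition step with an added quadratic
    (element-wise product) interaction between the child hidden states.
    Illustrative sketch only; the qLSTM-RvNN paper's gate wiring may differ."""
    W, Wq, b = params["W"], params["Wq"], params["b"]
    # Linear term over concatenated children + quadratic child interaction.
    pre = W @ np.concatenate([h_l, h_r]) + Wq @ (h_l * h_r) + b
    d = len(b) // 5
    i   = sigmoid(pre[0:d])        # input gate
    f_l = sigmoid(pre[d:2*d])      # forget gate for the left child
    f_r = sigmoid(pre[2*d:3*d])    # forget gate for the right child
    o   = sigmoid(pre[3*d:4*d])    # output gate
    u   = np.tanh(pre[4*d:5*d])    # candidate cell update
    c = i * u + f_l * c_l + f_r * c_r   # combine children's cell states
    h = o * np.tanh(c)                  # hidden state of the parent node
    return h, c

# Toy usage with random parameters (hypothetical dimensions).
rng = np.random.default_rng(0)
d = 4
params = {
    "W":  rng.normal(scale=0.1, size=(5 * d, 2 * d)),
    "Wq": rng.normal(scale=0.1, size=(5 * d, d)),
    "b":  np.zeros(5 * d),
}
h_l, c_l = rng.normal(size=d), np.zeros(d)
h_r, c_r = rng.normal(size=d), np.zeros(d)
h, c = quadratic_tree_lstm_node(h_l, c_l, h_r, c_r, params)
```

The quadratic term `h_l * h_r` lets the composition function model multiplicative word interactions (e.g., a negation word flipping the sentiment carried by its sibling subtree) that a purely linear combination of child vectors cannot express.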
Database: | OpenAIRE |
External link: |