Author: |
Chunyi Yue, Ang Li, Zhenjia Chen, Gan Luan, Siyao Guo |
Language: |
English |
Year of publication: |
2024 |
Subject: |
|
Source: |
Applied Sciences, Vol 14, Iss 17, p 7971 (2024) |
Document type: |
article |
ISSN: |
2076-3417 |
DOI: |
10.3390/app14177971 |
Description: |
Domain information plays a crucial role in sentiment analysis. Neural networks that treat domain information as attention can extract domain-related sentiment features from a shared feature pool, significantly enhancing the accuracy of sentiment analysis. However, when the sentiment polarity within the input text is inconsistent, these methods are unable to model the relative importance of sentiment information. To address this issue, we propose a novel attention neural network that fully utilizes domain information while also accounting for the relative importance of sentiment information. In our approach, first, dual long short-term memory (LSTM) networks are used to extract features from the input text for domain and sentiment classification, respectively. Next, a novel attention mechanism fuses these features to generate the attention distribution. Finally, the input text vector obtained by weighted summation is fed into the classification layer for sentiment classification. The empirical results from our experiments demonstrate that our method achieves superior classification accuracy on Amazon multi-domain sentiment analysis datasets. |
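The fusion step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the linear scoring weights (`w_d`, `w_s`) and the per-token feature lists standing in for the dual-LSTM outputs are hypothetical placeholders; only the softmax attention over fused domain and sentiment features, followed by a weighted sum of token features, follows the description above.

```python
import math

def softmax(scores):
    # Numerically stable softmax: normalizes scores into an attention distribution.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_attention(domain_feats, sentiment_feats, w_d, w_s):
    # Hypothetical linear fusion: score_t = w_d . d_t + w_s . s_t for each token t,
    # where d_t and s_t stand in for the domain-LSTM and sentiment-LSTM outputs.
    scores = [
        sum(wd * d for wd, d in zip(w_d, dt)) +
        sum(ws * s for ws, s in zip(w_s, st))
        for dt, st in zip(domain_feats, sentiment_feats)
    ]
    return softmax(scores)

def weighted_sum(token_feats, attn):
    # Attention-weighted sum of token features -> fixed-size text vector
    # that would then be fed into the classification layer.
    dim = len(token_feats[0])
    return [sum(a * f[i] for a, f in zip(attn, token_feats)) for i in range(dim)]
```

For example, with two tokens whose fused scores differ, the token with the higher score receives the larger attention weight, so its features dominate the resulting text vector.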
Database: |
Directory of Open Access Journals |
External link: |
|