Author:
Li, You; Lin, Zhizhou; Lin, Yuming; Yin, Jinhui; Chang, Liang
Source:
Cognitive Computation; Nov 2023, Vol. 15, Issue 6, p1973-1987, 15p
Abstract:
Word representation learning is a fundamental technique in cognitive computation that plays a crucial role in enabling machines to understand and process human language. By representing words as vectors in a high-dimensional space, computers can perform complex natural language processing tasks such as sentiment analysis. However, most word representation learning models are trained on open-domain corpora, which results in suboptimal performance on domain-specific tasks. To address this problem, we propose a unified learning framework that leverages external hybrid sentiment knowledge to enhance the sentiment information of word distributed representations. Specifically, we automatically acquire domain- and target-dependent sentiment knowledge from multiple sources. To mitigate knowledge noise, we introduce knowledge expectation and knowledge context weights to filter the acquired knowledge items. Finally, we integrate the filtered sentiment knowledge into the word distributed representations via a learning framework to enrich their semantic information. Extensive experiments are conducted to verify the effectiveness of enhancing sentiment information in word representations for different sentiment analysis tasks. The experimental results show that the proposed models significantly outperform state-of-the-art baselines. Our work demonstrates the advantages of sentiment-enhanced word representations in sentiment analysis tasks and provides insights into the acquisition and fusion of sentiment knowledge from different domains for generating word representations with richer semantics. [ABSTRACT FROM AUTHOR]
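To make the idea concrete, the following is a minimal, hypothetical sketch of the kind of joint objective the abstract describes: a standard skip-gram co-occurrence loss combined with a sentiment-knowledge term whose influence is scaled by a per-item confidence weight (standing in for the knowledge expectation and knowledge context weights). The variable names, the toy data, and the exact form of the sentiment term are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["good", "bad", "battery", "screen", "excellent", "poor"]
word2id = {w: i for i, w in enumerate(vocab)}
dim = 16
W = rng.normal(scale=0.1, size=(len(vocab), dim))   # word vectors
C = rng.normal(scale=0.1, size=(len(vocab), dim))   # context vectors

# Toy co-occurrence pairs (target word, context word)
pairs = [("battery", "good"), ("battery", "excellent"),
         ("screen", "bad"), ("screen", "poor")]

# Toy sentiment-knowledge items: (word, polarity, confidence weight).
# The confidence weight plays the role of the filtering weights
# (knowledge expectation / context weights) described in the abstract.
knowledge = [("good", +1, 0.9), ("excellent", +1, 0.8),
             ("bad", -1, 0.9), ("poor", -1, 0.7)]

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, lam = 0.1, 0.5   # learning rate and sentiment-term weight

for epoch in range(200):
    # 1) Skip-gram term with one random negative sample per pair
    for t, c in pairs:
        ti, ci = word2id[t], word2id[c]
        ni = rng.integers(len(vocab))
        g = 1.0 - sigmoid(W[ti] @ C[ci])   # positive pair: raise score
        W[ti] += lr * g * C[ci]
        C[ci] += lr * g * W[ti]
        g = -sigmoid(W[ti] @ C[ni])        # negative sample: lower score
        W[ti] += lr * g * C[ni]
        C[ni] += lr * g * W[ti]

    # 2) Sentiment-knowledge term: pull same-polarity words together,
    #    scaled by the confidence weights of both knowledge items
    for (w1, p1, a1) in knowledge:
        for (w2, p2, a2) in knowledge:
            if w1 != w2 and p1 == p2:
                i, j = word2id[w1], word2id[w2]
                W[i] -= lr * lam * a1 * a2 * (W[i] - W[j])

# Same-polarity words should end up closer than opposite-polarity ones
def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cos(W[word2id["good"]], W[word2id["excellent"]]))  # high
print(cos(W[word2id["good"]], W[word2id["bad"]]))         # lower
```

The design choice illustrated here is that the sentiment term only reweights the embedding space rather than replacing the distributional signal, so filtering weights close to zero let noisy knowledge items fade out of the update.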
Database:
Complementary Index |
External link:
|