Description: |
Aspect-context sentiment classification aims to classify the sentiment expressed toward an aspect with respect to its context. Typically, machine learning models consider the aspect and the context separately and do not process the two in parallel. To model contexts and aspects separately, most attention-based methods employ Long Short-Term Memory (LSTM) networks; attention mechanisms, on the other hand, account for this interaction by processing the aspect and context sequences in parallel. The interactive attention mechanism extracts features of a specific aspect with respect to its context in the sequence, which means aspects are taken into account when generating context sequence representations. However, when determining the relationships between words in a sentence, the interactive attention mechanism does not consider semantic dependency information. Moreover, such attention mechanisms fail to capture polysemous words, because conventional embedding models such as GloVe word vectors are normally used, and these assign a single static vector to each word regardless of its context. In this study, transformers are integrated into the attention-mechanism approaches to overcome this semantic relationship problem: the pre-trained BERT language model is used to capture the relationships among the words in a sentence, and the interactive attention mechanism is then applied to the contextual word representations it produces. The final sequence representations of the context and the aspect are fed into general machine learning classifiers for aspect-level sentiment classification. The proposed model was evaluated on two datasets, i.e., Restaurant and Laptop reviews. The proposed approach achieves state-of-the-art results with all attention mechanisms and attains significantly better performance than existing approaches.
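As a rough illustration of the described pipeline, the sketch below (assuming PyTorch and the Hugging Face transformers library) runs context and aspect through BERT, combines the resulting contextual embeddings with an interactive attention layer, and feeds the pooled representation to a linear classifier. The class names, the bert-base-uncased checkpoint, the three-way label set, and the simplified dot-product attention scoring are illustrative assumptions, not details taken from the study.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class InteractiveAttention(nn.Module):
    """Simplified interactive attention (cf. IAN): each sequence is
    scored against the mean-pooled summary of the other sequence."""
    def __init__(self, hidden_size):
        super().__init__()
        self.ctx_proj = nn.Linear(hidden_size, hidden_size)
        self.asp_proj = nn.Linear(hidden_size, hidden_size)

    @staticmethod
    def _attend(seq, query):
        # seq: (B, T, H), query: (B, H) -> attention-weighted sum over T
        scores = torch.bmm(seq, query.unsqueeze(-1))   # (B, T, 1)
        weights = torch.softmax(scores, dim=1)         # normalize over tokens
        return (weights * seq).sum(dim=1)              # (B, H)

    def forward(self, ctx, asp):
        # Context tokens attended by the aspect summary, and vice versa
        ctx_rep = self._attend(ctx, self.ctx_proj(asp.mean(dim=1)))
        asp_rep = self._attend(asp, self.asp_proj(ctx.mean(dim=1)))
        return torch.cat([ctx_rep, asp_rep], dim=-1)   # (B, 2H)

class BertInteractiveClassifier(nn.Module):
    def __init__(self, num_classes=3):  # e.g. positive / negative / neutral
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.attention = InteractiveAttention(hidden)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, ctx_inputs, asp_inputs):
        # Contextual token embeddings from BERT for context and aspect;
        # padding masks inside the attention layer are omitted for brevity.
        ctx = self.bert(**ctx_inputs).last_hidden_state  # (B, T_ctx, H)
        asp = self.bert(**asp_inputs).last_hidden_state  # (B, T_asp, H)
        return self.classifier(self.attention(ctx, asp))

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertInteractiveClassifier()
ctx = tokenizer("The food was great but the service was slow.", return_tensors="pt")
asp = tokenizer("service", return_tensors="pt")
print(model(ctx, asp).shape)  # torch.Size([1, 3])
```

The concatenated context and aspect vectors could equally be passed to any general classifier (e.g. an SVM or logistic regression), as the abstract suggests; the linear head above is just the simplest end-to-end choice.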