Cross-Domain Aspect Extraction using Transformers Augmented with Knowledge Graphs

Authors: Phillip Howard, Arden Ma, Vasudev Lal, Ana Paula Simoes, Daniel Korat, Oren Pereg, Moshe Wasserblat, Gadi Singer
Year of publication: 2022
Subject:
Source: Proceedings of the 31st ACM International Conference on Information & Knowledge Management (CIKM 2022). Association for Computing Machinery, New York, NY, USA, 780-790
Document type: Working Paper
DOI: 10.1145/3511808.3557275
Description: The extraction of aspect terms is a critical step in fine-grained sentiment analysis of text. Existing approaches for this task have yielded impressive results when the training and testing data are from the same domain. However, these methods show a drastic decrease in performance when applied to cross-domain settings where the domain of the testing data differs from that of the training data. To address this lack of extensibility and robustness, we propose a novel approach for automatically constructing domain-specific knowledge graphs that contain information relevant to the identification of aspect terms. We introduce a methodology for injecting information from these knowledge graphs into Transformer models, including two alternative mechanisms for knowledge insertion: via query enrichment and via manipulation of attention patterns. We demonstrate state-of-the-art performance on benchmark datasets for cross-domain aspect term extraction using our approach and investigate how the amount of external knowledge available to the Transformer impacts model performance.
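The abstract only names the two knowledge-insertion mechanisms, so the following is a minimal, hypothetical sketch of what they could look like, assuming PyTorch: a query-enrichment helper that appends knowledge-graph term matches to the input, and an additive attention bias that raises attention toward tokens found in the knowledge graph. The helper names (enrich_query, kg_attention_bias, biased_self_attention) and the simple exact-match lookup are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only; not the authors' implementation.
import torch
import torch.nn.functional as F

def enrich_query(tokens: list[str], kg_terms: set[str]) -> list[str]:
    """Query enrichment: append KG terms found in the sentence as an
    auxiliary segment, exposing candidate aspect terms to the model."""
    matched = [t for t in tokens if t.lower() in kg_terms]
    return tokens + ["[SEP]"] + matched

def kg_attention_bias(tokens: list[str], kg_terms: set[str],
                      bias_value: float = 2.0) -> torch.Tensor:
    """Attention manipulation: build an additive bias that increases the
    attention scores directed at tokens present in the knowledge graph."""
    hit = torch.tensor([1.0 if t.lower() in kg_terms else 0.0 for t in tokens])
    # Broadcast over query positions: every query attends more to KG hits.
    return bias_value * hit.unsqueeze(0)          # shape (1, seq_len)

def biased_self_attention(q, k, v, bias):
    """Scaled dot-product attention with the KG-derived additive bias."""
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5   # (seq_len, seq_len)
    scores = scores + bias                        # inject the KG signal
    return F.softmax(scores, dim=-1) @ v

# Toy usage with a restaurant/electronics-style sentence and KG term list.
tokens = ["the", "battery", "life", "is", "great"]
kg_terms = {"battery", "life", "screen"}
print(enrich_query(tokens, kg_terms))             # original tokens + [SEP] + matches

n, d_model = len(tokens), 8
x = torch.randn(n, d_model)
bias = kg_attention_bias(tokens, kg_terms)
out = biased_self_attention(x, x, x, bias)
print(out.shape)                                  # torch.Size([5, 8])
```

In a full model, such a bias would typically be applied inside selected self-attention heads of the Transformer, and the enriched query would be re-tokenized before encoding; both details are omitted here for brevity.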
Database: arXiv