Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

Author: Jinhua Du, Jingguang Han, Dadong Wan, Andy Way
Language: English
Year of publication: 2018
Source: Du, Jinhua (ORCID: 0000-0002-3267-4881), Han, Jingguang, Way, Andy (ORCID: 0000-0001-5736-5930) and Wan, Dadong (2018) Multi-level structured self-attentions for distantly supervised relation extraction. In: EMNLP 2018: Conference on Empirical Methods in Natural Language Processing, 31 Oct-4 Nov 2018, Brussels, Belgium.
Description: Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, traditional 1-D vector attention models are insufficient for learning the different contexts involved in selecting valid instances to predict the relation for an entity pair. To alleviate this issue, we propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning (MIL) framework using bidirectional recurrent neural networks. In the proposed method, a structured word-level self-attention mechanism learns a 2-D matrix in which each row vector represents a weight distribution over different aspects of an instance with respect to the two entities. Targeting the MIL issue, the structured sentence-level attention learns a 2-D matrix in which each row vector represents a weight distribution over the selection of different valid instances. Experiments conducted on two publicly available DS-RE datasets show that the proposed framework with a multi-level structured self-attention mechanism significantly outperforms state-of-the-art baselines in terms of PR curves, P@N and F1 measures.
Accepted by EMNLP 2018
Database: OpenAIRE
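As an illustration of the 2-D (matrix) attention described above, the following is a minimal PyTorch sketch of structured self-attention in the style of Lin et al. (2017), the formulation the paper builds on at both the word and the sentence level. All class names, parameter names, and dimensions here are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    """2-D (matrix) self-attention: each of the n_hops rows of the
    attention matrix is a separate weight distribution over positions,
    capturing a different aspect of the input (hypothetical sketch)."""

    def __init__(self, hidden_dim: int, attn_dim: int = 350, n_hops: int = 4):
        super().__init__()
        # W_s1 projects the hidden states; W_s2 emits one attention row per hop.
        self.w_s1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w_s2 = nn.Linear(attn_dim, n_hops, bias=False)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, n, hidden_dim) -- BiLSTM outputs at the word level,
        # or a bag of sentence encodings at the sentence level (MIL).
        scores = self.w_s2(torch.tanh(self.w_s1(h)))      # (batch, n, n_hops)
        a = torch.softmax(scores, dim=1).transpose(1, 2)  # (batch, n_hops, n)
        return a @ h  # (batch, n_hops, hidden_dim): multi-aspect representation

# Hypothetical usage: word-level attention over BiLSTM states of one batch.
if __name__ == "__main__":
    attn = StructuredSelfAttention(hidden_dim=512)
    bilstm_out = torch.randn(8, 30, 512)  # (batch, seq_len, 2 * lstm_units)
    m = attn(bilstm_out)
    print(m.shape)  # torch.Size([8, 4, 512])
```

Under these assumptions, flattening the resulting n_hops x hidden_dim matrix at the word level yields a multi-aspect sentence embedding; applying the same module over the stacked sentence embeddings of an entity-pair bag gives the sentence-level attention, where each row of the matrix is a weight distribution over candidate valid instances.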