Context-Aware Neural Model for Temporal Information Extraction
Author: | Yuanliang Meng, Anna Rumshisky |
---|---|
Year: | 2018 |
Subject: |
Source code, artificial neural network, context (language use), machine learning, Neural Turing Machine, attention, text processing, artificial intelligence |
Source: | ACL (1) |
DOI: | 10.18653/v1/p18-1049 |
Description: | We propose a context-aware neural network model for temporal information extraction. The model has a uniform architecture for event-event, event-timex, and timex-timex pairs. A Global Context Layer (GCL), inspired by the Neural Turing Machine (NTM), stores processed temporal relations in narrative order and retrieves them when relevant entities are encountered, so that relations are classified in context. The GCL's long-term memory and attention mechanisms resolve irregular long-distance dependencies that regular RNNs such as LSTMs cannot capture. The model requires no new input features, yet outperforms existing models in the literature. To our knowledge, it is also the first model to use an NTM-like architecture to exploit global context in discourse-scale natural text processing. We plan to release the source code. |
Database: | OpenAIRE |
External link: |
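The description sketches the GCL's core mechanism: processed relation representations are written to a memory in narrative order and later retrieved by attention when a related entity pair arrives. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the class and method names, the use of plain dot-product attention, and the vector dimensions are all assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

class GlobalContextMemory:
    """Hypothetical sketch of an NTM-style global context layer:
    relation vectors are appended in narrative order and read back
    with content-based attention."""

    def __init__(self, dim):
        self.dim = dim
        self.slots = []  # one memory row per processed relation

    def write(self, relation_vec):
        # Store a processed relation representation in narrative order.
        self.slots.append(np.asarray(relation_vec, dtype=float))

    def read(self, query_vec):
        # Content-based attention: dot-product scores -> softmax
        # weights -> weighted sum over memory rows.
        if not self.slots:
            return np.zeros(self.dim)
        memory = np.stack(self.slots)                    # (n_slots, dim)
        scores = memory @ np.asarray(query_vec, float)   # (n_slots,)
        weights = softmax(scores)
        return weights @ memory                          # (dim,)

# Usage: write relations as the narrative unfolds, then read context
# for a new entity pair before classifying its relation.
mem = GlobalContextMemory(dim=3)
mem.write([1.0, 0.0, 0.0])
mem.write([0.0, 1.0, 0.0])
context = mem.read([1.0, 0.0, 0.0])  # attends most to the first relation
```

Because reads are attention-weighted over the whole memory rather than limited to a fixed window, such a layer can, in principle, connect a relation to arbitrarily distant earlier relations, which is the long-distance behavior the description attributes to the GCL.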