Diversity driven Attention Model for Query-based Abstractive Summarization
Author: Nema, Preksha; Khapra, Mitesh; Laha, Anirban; Ravindran, Balaraman
Publication Year: 2017
Subject:
Document Type: Working Paper
Description: Abstractive summarization aims to generate a shorter version of a document that covers all the salient points in a compact and coherent fashion. Query-based summarization, on the other hand, highlights those points that are relevant in the context of a given query. The encode-attend-decode paradigm has achieved notable success in machine translation, extractive summarization, dialog systems, and related tasks, but it suffers from the drawback of generating repeated phrases. In this work we propose a model for the query-based summarization task based on the encode-attend-decode paradigm with two key additions: (i) a query attention model (in addition to the document attention model) which learns to focus on different portions of the query at different time steps, instead of using a static representation of the query, and (ii) a new diversity-based attention model which aims to alleviate the problem of repeating phrases in the summary. To enable testing of this model we introduce a new query-based summarization dataset built on Debatepedia. Our experiments show that with these two additions the proposed model clearly outperforms vanilla encode-attend-decode models, with a gain of 28% (absolute) in ROUGE-L scores. Comment: Accepted at ACL 2017. (An illustrative sketch of the two attention additions follows this record.)
Database: arXiv
External Link:
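The record itself gives no implementation details, so the following is only a minimal sketch of the two additions named in the abstract: a query attention that is recomputed at every decoder step, and a diversity-based document attention that discourages repeated phrases. Everything here is an assumption made for illustration, including the bilinear alignment function, the projection-based "diversity" step, and all variable names and dimensions; it is not the authors' exact formulation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(states, dec_state, W):
    # Bilinear alignment scores followed by a softmax; returns the
    # attention-weighted context vector over the encoder states.
    scores = states @ (W @ dec_state)
    weights = softmax(scores)
    return weights @ states

def diversify(context, acc_context):
    # Subtract the projection of the current context onto the accumulated
    # past contexts, so successive contexts stay different. This is one
    # possible (hypothetical) reading of "diversity based attention".
    if acc_context is None or not np.any(acc_context):
        return context
    proj = (context @ acc_context) / (acc_context @ acc_context) * acc_context
    return context - proj

# Toy sizes (hypothetical): hidden size 4, 5 document tokens, 3 query tokens.
rng = np.random.default_rng(0)
h, n_doc, n_qry = 4, 5, 3
D = rng.normal(size=(n_doc, h))   # document encoder states
Q = rng.normal(size=(n_qry, h))   # query encoder states
W_d = rng.normal(size=(h, h))     # document-attention parameters
W_q = rng.normal(size=(h, h))     # query-attention parameters

dec_state = rng.normal(size=h)    # decoder hidden state
acc = None                        # accumulated document contexts
for t in range(3):                # a few decoder time steps
    q_ctx = attend(Q, dec_state, W_q)          # query attention, recomputed each step
    d_ctx = attend(D, dec_state + q_ctx, W_d)  # document attention conditioned on the query context
    d_ctx = diversify(d_ctx, acc)              # diversity step on the document context
    acc = d_ctx if acc is None else acc + d_ctx
    # A decoder RNN would now consume d_ctx and q_ctx to update dec_state
    # and emit the next summary token (omitted here).
```

The intent of the sketch is to show the two mechanisms described in the abstract: the query context is not a static vector but is re-attended at every step, and the document context is made roughly orthogonal to previously used contexts, which is one way to keep the decoder from attending to, and therefore repeating, the same phrases.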