Efficient Attention using a Fixed-Size Memory Representation
Author: | Denny Britz, Minh-Thang Luong, Melody Y. Guan |
Language: | English |
Year of publication: | 2017 |
Subject: | FOS: Computer and information sciences; Computer Science - Computation and Language (cs.CL) |
Source: | EMNLP |
Description: | The standard content-based attention mechanism typically used in sequence-to-sequence models is computationally expensive, as it requires comparing large encoder and decoder states at each time step. In this work, we propose an alternative attention mechanism based on a fixed-size memory representation that is more efficient. Our technique predicts a compact set of K attention contexts during encoding and lets the decoder compute an efficient lookup that does not need to consult the memory. We show that our approach performs on par with the standard attention mechanism while yielding inference speedups of 20% for real-world translation tasks, and more for tasks with longer sequences. By visualizing attention scores we demonstrate that our models learn distinct, meaningful alignments. |
Database: | OpenAIRE |
External link: |
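
The description above outlines the idea at a high level: the encoder produces a small, fixed number of attention contexts, and the decoder mixes them using weights computed from its own state, so the per-step cost no longer depends on the source length. Below is a minimal NumPy sketch of that idea under stated assumptions; the names `K`, `W_enc`, and `W_dec` are hypothetical and not taken from the paper, and the sketch is not the authors' implementation.

```python
# Illustrative sketch only: predict K attention contexts during encoding,
# then let each decoder step compute mixture weights from its own state,
# without comparing against every encoder state.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

S, D, K = 50, 128, 32            # source length, hidden size, number of contexts (assumed sizes)
rng = np.random.default_rng(0)

H = rng.normal(size=(S, D))      # encoder hidden states
W_enc = rng.normal(size=(D, K))  # hypothetical projection: per-position scores for each of K slots
W_dec = rng.normal(size=(D, K))  # hypothetical projection: decoder state -> K mixture weights

# Encoding: one pass over the source yields K fixed-size attention contexts.
alpha = softmax(H @ W_enc, axis=0)   # (S, K) attention over source positions, one column per slot
C = alpha.T @ H                      # (K, D) compact memory

# Decoding step: weights depend only on the decoder state, so the lookup
# costs O(K * D) regardless of the source length S.
h_dec = rng.normal(size=(D,))
beta = softmax(h_dec @ W_dec)        # (K,) mixture weights
context = beta @ C                   # (D,) attention context for this step
print(context.shape)
```

In a real model the random matrices would be learned parameters and `H` and `h_dec` would come from the encoder and decoder networks; the sketch only shows where the fixed-size memory removes the length-dependent comparison of standard content-based attention.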