Associative Recurrent Memory Transformer
Author: | Rodkin, Ivan, Kuratov, Yuri, Bulatov, Aydar, Burtsev, Mikhail |
---|---|
Year of Publication: | 2024 |
Subject: | |
Document Type: | Working Paper |
Description: | This paper addresses the challenge of building a neural architecture for very long sequences that processes new information in constant time at each step. Our approach, the Associative Recurrent Memory Transformer (ARMT), is based on transformer self-attention for local context and on segment-level recurrence for the storage of task-specific information distributed over a long context. We demonstrate that ARMT outperforms existing alternatives on associative retrieval tasks and sets a new performance record on the recent BABILong multi-task long-context benchmark, answering single-fact questions over 50 million tokens with an accuracy of 79.9%. The source code for training and evaluation is available on GitHub. (An illustrative sketch of the segment-level recurrence idea follows this record.) Comment: ICML 2024 Next Generation of Sequence Modeling Architectures Workshop |
Database: | arXiv |
External Link: |
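
The description above rests on segment-level recurrence: a transformer processes a long input segment by segment and carries a fixed set of memory tokens from one segment to the next, so each new segment costs constant time. The following is a minimal PyTorch sketch of that general idea, assuming hypothetical class and parameter names; it is not the authors' ARMT implementation, which additionally uses an associative memory update (see the paper's GitHub code for the actual architecture).

```python
import torch
import torch.nn as nn

class SegmentRecurrentEncoder(nn.Module):
    """Sketch of segment-level recurrence with memory tokens.

    A transformer encoder attends jointly over learned memory tokens and
    the current segment; the updated memory tokens are carried forward to
    the next segment. Hypothetical illustration, not the ARMT code.
    """

    def __init__(self, d_model=128, n_mem=8, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Learned initial memory tokens, shared across the batch.
        self.memory = nn.Parameter(torch.randn(1, n_mem, d_model))
        self.n_mem = n_mem

    def forward(self, segments):
        # segments: list of (batch, seg_len, d_model) tensors.
        mem = self.memory.expand(segments[0].size(0), -1, -1)
        outputs = []
        for seg in segments:
            # Prepend memory; self-attention mixes memory and segment tokens.
            x = torch.cat([mem, seg], dim=1)
            y = self.encoder(x)
            mem = y[:, :self.n_mem]          # updated memory carried forward
            outputs.append(y[:, self.n_mem:])
        return torch.cat(outputs, dim=1), mem

model = SegmentRecurrentEncoder()
segments = [torch.randn(2, 64, 128) for _ in range(4)]  # 4 segments, batch 2
out, final_mem = model(segments)
print(out.shape)  # torch.Size([2, 256, 128])
```

Because each step attends only over one segment plus a constant number of memory tokens, per-step cost stays constant as the total sequence grows; ARMT's contribution is to make the carried memory an associative store rather than plain recurrent tokens.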