Are Neighbors Enough? Multi-Head Neural n-gram can be Alternative to Self-attention

Author: Loem, Mengsay; Takase, Sho; Kaneko, Masahiro; Okazaki, Naoaki
Year of publication: 2022
Subject:
Document type: Working Paper
Description: The impressive performance of the Transformer has been attributed to self-attention, which considers dependencies across the entire input sequence at every position. In this work, we reformulate the neural $n$-gram model, which focuses on only a few surrounding representations at each position, with the multi-head mechanism of Vaswani et al. (2017). Through experiments on sequence-to-sequence tasks, we show that replacing self-attention in the Transformer with multi-head neural $n$-gram achieves performance comparable to or better than the Transformer. Various analyses of the proposed method reveal that multi-head neural $n$-gram is complementary to self-attention, and that combining the two further improves the performance of the vanilla Transformer.
Database: arXiv
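
The description above outlines the core idea: each position is represented by mixing only a small window of neighboring representations, done separately per head, instead of attending over the whole sequence. Below is a minimal sketch of such a multi-head neural $n$-gram layer. It is not the authors' exact formulation (the abstract does not specify the window size, the weighting scheme, or whether the window is causal); the left-context window, the learned per-offset softmax weights, and the class name `MultiHeadNeuralNgram` are assumptions for illustration only.

```python
# Hypothetical sketch of a multi-head neural n-gram layer as a drop-in
# replacement for self-attention.  Each head mixes the current token with
# its (ngram - 1) preceding representations using learned per-offset weights.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadNeuralNgram(nn.Module):
    def __init__(self, d_model: int, num_heads: int, ngram: int = 4):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        self.ngram = ngram
        self.in_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        # One learned weight per head and per offset inside the n-gram window.
        self.offset_weights = nn.Parameter(torch.zeros(num_heads, ngram))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        batch, seq_len, _ = x.shape
        h = self.in_proj(x).view(batch, seq_len, self.num_heads, self.head_dim)

        # Collect each position's window (the token itself plus its ngram - 1
        # predecessors) by left-padding the sequence and stacking shifted views:
        # result shape (batch, seq_len, ngram, num_heads, head_dim).
        padded = F.pad(h, (0, 0, 0, 0, self.ngram - 1, 0))
        windows = torch.stack(
            [padded[:, i : i + seq_len] for i in range(self.ngram)], dim=2
        )

        # Convex combination over the window, computed separately per head.
        weights = F.softmax(self.offset_weights, dim=-1)  # (num_heads, ngram)
        mixed = torch.einsum("bsnhd,hn->bshd", windows, weights)
        return self.out_proj(mixed.reshape(batch, seq_len, -1))


if __name__ == "__main__":
    layer = MultiHeadNeuralNgram(d_model=512, num_heads=8, ngram=4)
    out = layer(torch.randn(2, 10, 512))
    print(out.shape)  # torch.Size([2, 10, 512])
```

Because each output depends only on a fixed-size local window rather than on pairwise scores over the whole sequence, the cost per position is constant in sequence length, which is the practical motivation for asking whether neighbors are enough.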