Self-Attentive Residual Decoder for Neural Machine Translation
Authors: | Nikolaos Pappas, Lesly Miculicich Werlen, Andrei Popescu-Belis, Dhananjay Ram |
---|---|
Year of publication: | 2018 |
Subject: |
FOS: Computer and information sciences
Computer Science - Computation and Language (cs.CL); Machine translation; Speech recognition; Context (language use); Residual learning; Decoding methods |
Source: | NAACL-HLT; arXiv.org e-Print Archive; Scopus-Elsevier |
DOI: | 10.5281/zenodo.2276145 |
Description: | Neural sequence-to-sequence networks with attention have achieved remarkable performance in machine translation. One reason for their effectiveness is their ability to capture relevant source-side contextual information at each prediction time step through an attention mechanism. However, the target-side context relies solely on the sequence model, which in practice is prone to a recency bias and cannot effectively capture non-sequential dependencies among words. To address this limitation, we propose a target-side-attentive residual recurrent network for decoding, where attention over previously translated words contributes directly to the prediction of the next word. The residual learning facilitates the flow of information from the distant past and can emphasize any of the previously translated words, thereby gaining access to a wider context. The proposed model outperforms a neural MT baseline as well as a memory network and a self-attention network on three language pairs. Analysis of the attention learned by the decoder confirms that it attends to a wider context and that it captures syntactic-like structures. Accepted at NAACL-HLT 2018; published in the Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers). |
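The core mechanism described above — attention over previously translated target words, combined residually with the current decoder state — can be sketched in a few lines of NumPy. This is an illustrative simplification, not the paper's exact parameterization: plain dot-product scoring stands in for the learned attention function, and all names (`attentive_residual_step`, `h_t`, `prev_states`) are hypothetical.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attentive_residual_step(h_t, prev_states):
    """Combine the current decoder state with an attention-weighted
    summary of previously generated target-side states.

    h_t:         shape (d,),   current decoder hidden state
    prev_states: shape (t, d), states of previously translated words
    Returns the residual combination and the attention distribution.
    """
    # Score each previous target word against the current state.
    # (The paper learns this scoring; dot product is an assumption here.)
    scores = prev_states @ h_t            # shape (t,)
    weights = softmax(scores)             # attention over past words
    context = weights @ prev_states       # shape (d,), target-side context
    # Residual connection: information from any previously translated
    # word can flow directly into the next-word prediction.
    return h_t + context, weights

rng = np.random.default_rng(0)
d, t = 8, 5
h_t = rng.standard_normal(d)
prev_states = rng.standard_normal((t, d))
out, weights = attentive_residual_step(h_t, prev_states)
```

Because the context vector is added rather than gated through the recurrence, a word far in the past with high attention weight contributes to the prediction just as directly as the most recent one — this is the counter to the recency bias the abstract describes.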
Database: | OpenAIRE |
External link: |