SG-Net: Syntax Guided Transformer for Language Representation
Author: | Yuwei Wu, Rui Wang, Junru Zhou, Zhuosheng Zhang, Sufeng Duan, Hai Zhao |
---|---|
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences
Artificial Intelligence (cs.AI); Computation and Language (cs.CL); Information Retrieval (cs.IR); Machine translation; Machine reading comprehension; Natural language inference; Language representation; Transformer (machine learning model); Self-attention; Syntax; Dependency parsing; Encoder |
Source: | IEEE Transactions on Pattern Analysis and Machine Intelligence. 44:3285-3299 |
ISSN: | 1939-3539 0162-8828 |
Description: | Understanding human language is one of the key themes of artificial intelligence. For language representation, the capacity to effectively model linguistic knowledge from detail-riddled, lengthy texts while filtering out noise is essential to performance. Traditional attentive models attend to all words without explicit constraint, which results in inaccurate concentration on dispensable words. In this work, we propose using syntax to guide text modeling by incorporating explicit syntactic constraints into attention mechanisms for better linguistically motivated word representations. Specifically, for the self-attention network (SAN) of a Transformer-based encoder, we introduce a syntactic-dependency-of-interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention. The syntax-guided network (SG-Net) is then composed of this extra SDOI-SAN and the SAN of the original Transformer encoder through a dual contextual architecture for better linguistically inspired representation. The proposed SG-Net is applied to typical Transformer encoders. Extensive experiments on popular benchmark tasks, including machine reading comprehension, natural language inference, and neural machine translation, show the effectiveness of the proposed SG-Net design. Accepted by IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI); a journal extension of arXiv:1908.05147 (AAAI 2020). |
Database: | OpenAIRE |
External link: |
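The syntax-guided self-attention summarized in the description can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the SDOI mask lets each token attend only to itself and its ancestors in a dependency tree (given as a `heads` array), and restricts a standard scaled dot-product attention to that mask. The function names and the single-head, NumPy-only setup are assumptions for clarity.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sdoi_mask(heads):
    """Boolean mask where token i may attend to itself and its
    dependency-tree ancestors. heads[i] is the index of token i's
    head word, with -1 marking the root. (Illustrative SDOI variant.)"""
    n = len(heads)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        j = i
        while j != -1:          # walk up to the root
            mask[i, j] = True
            j = heads[j]
    return mask

def syntax_guided_attention(Q, K, V, mask):
    """Scaled dot-product attention with disallowed positions
    suppressed before the softmax."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)   # block non-SDOI positions
    return softmax(scores) @ V
```

For example, with `heads = [1, -1, 1]` (tokens 0 and 2 depend on token 1, the root), token 1 can attend only to itself, so its output row copies its own value vector; the dual architecture in SG-Net would then combine such syntax-masked attention with the unconstrained SAN of the original encoder.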