Influence Paths for Characterizing Subject-Verb Number Agreement in LSTM Language Models
Author: | Matt Fredrikson, Anupam Datta, Kaiji Lu, Klas Leino, Piotr Mardziel |
Year of publication: | 2020 |
Subject: |
FOS: Computer and information sciences; Computer Science - Computation and Language (cs.CL); natural language processing; language models; recurrent neural networks; subject (grammar); verb; grammatical number; agreement; artificial intelligence |
Source: | ACL |
DOI: | 10.18653/v1/2020.acl-main.430 |
Description: | LSTM-based recurrent neural networks are the state of the art for many natural language processing (NLP) tasks. Despite their performance, it is unclear whether, or how, LSTMs learn structural features of natural languages such as subject-verb number agreement in English. Lacking this understanding, the generality of LSTM performance on this task and their suitability for related tasks remain uncertain. Further, errors cannot be properly attributed to a lack of structural capability, training data omissions, or other exceptional faults. We introduce *influence paths*, a causal account of structural properties as carried by paths across gates and neurons of a recurrent neural network. The approach refines the notion of influence (e.g., the subject's grammatical number influences the grammatical number of the subsequent verb) into a set of gate- or neuron-level paths. The set localizes and segments the concept (e.g., subject-verb agreement), its constituent elements (e.g., the subject), and related or interfering elements (e.g., attractors). We exemplify the methodology on a widely studied multi-layer LSTM language model, demonstrating how it accounts for subject-verb number agreement. The results offer both a finer and a more complete view of an LSTM's handling of this structural aspect of the English language than prior results based on diagnostic classifiers and ablation. Comment: ACL 2020 |
Database: | OpenAIRE |
External link: |
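The core idea in the description, that total influence decomposes into paths routed through individual LSTM gates, can be illustrated on a single LSTM step. The sketch below is a minimal toy example with scalar weights, not the paper's model or its attribution method: the derivative of the hidden state with respect to the input splits exactly into four terms, one per path through the input, forget, candidate, and output gates, and their sum matches the full (numerically estimated) gradient.

```python
import numpy as np

np.random.seed(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Hypothetical scalar LSTM parameters and previous cell state (toy values).
Wi, Wf, Wg, Wo = np.random.randn(4)
bi, bf, bg, bo = np.random.randn(4)
c_prev = 0.5

def lstm_step(x):
    """One scalar LSTM step: returns the hidden state h for input x."""
    i = sigmoid(Wi * x + bi)   # input gate
    f = sigmoid(Wf * x + bf)   # forget gate
    g = np.tanh(Wg * x + bg)   # candidate cell value
    o = sigmoid(Wo * x + bo)   # output gate
    c = f * c_prev + i * g     # new cell state
    return o * np.tanh(c)      # hidden state

x = 0.3
i = sigmoid(Wi * x + bi)
f = sigmoid(Wf * x + bf)
g = np.tanh(Wg * x + bg)
o = sigmoid(Wo * x + bo)
c = f * c_prev + i * g
tanh_c = np.tanh(c)
sech2_c = 1.0 - tanh_c ** 2    # d tanh(c) / dc

# Per-path influence terms: dh/dx routed through each gate (chain rule).
path_i = o * sech2_c * g      * i * (1 - i) * Wi   # via input gate
path_f = o * sech2_c * c_prev * f * (1 - f) * Wf   # via forget gate
path_g = o * sech2_c * i      * (1 - g ** 2) * Wg  # via candidate
path_o = tanh_c               * o * (1 - o) * Wo   # via output gate
total = path_i + path_f + path_g + path_o

# The path terms must sum to the full gradient (central finite difference).
eps = 1e-6
numeric = (lstm_step(x + eps) - lstm_step(x - eps)) / (2 * eps)
assert abs(total - numeric) < 1e-6
```

The decomposition is exact because the hidden state is a composition of the four gate functions; the paper's contribution is attributing a *concept* (number agreement) to such paths in a real multi-layer model, which this toy gradient split only gestures at.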