Cross-Linguistic Syntactic Evaluation of Word Prediction Models
Author: Panayiota Petrou-Zeniou, Aaron Mueller, Tal Linzen, Garrett Nicolai, Natalia Talmina
Year of publication: 2020
Subject: Computer Science - Computation and Language (cs.CL); natural language processing; language models; syntax; subject-verb agreement; morphology (linguistics); German; Hebrew
Source: ACL
DOI: 10.18653/v1/2020.acl-main.490
Description: A range of studies have concluded that neural word prediction models can distinguish grammatical from ungrammatical sentences with high accuracy. However, these studies are based primarily on monolingual evidence from English. To investigate how these models' ability to learn syntax varies by language, we introduce CLAMS (Cross-Linguistic Assessment of Models on Syntax), a syntactic evaluation suite for monolingual and multilingual models. CLAMS includes subject-verb agreement challenge sets for English, French, German, Hebrew, and Russian, generated from grammars we develop. We use CLAMS to evaluate LSTM language models as well as monolingual and multilingual BERT. Across languages, monolingual LSTMs achieved high accuracy on dependencies without attractors, but generally poor accuracy on agreement across object relative clauses. On other constructions, agreement accuracy was generally higher in languages with richer morphology. Multilingual models generally underperformed monolingual models. Multilingual BERT showed high syntactic accuracy on English, but noticeable deficiencies in other languages. Comment: Accepted for presentation at ACL 2020.
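The description above outlines a minimal-pair evaluation: the model is credited with a correct judgment when it scores the grammatical sentence of a pair above its ungrammatical counterpart. Below is a minimal sketch of that scoring loop, not the authors' code; the example pairs and the stand-in scorer are hypothetical, and a real CLAMS evaluation would use log-probabilities from an LSTM or BERT language model instead.

```python
def minimal_pair_accuracy(score, pairs):
    """Fraction of (grammatical, ungrammatical) pairs where the
    grammatical sentence outscores its twin (ties count as errors)."""
    correct = sum(1 for good, bad in pairs if score(good) > score(bad))
    return correct / len(pairs)

# Hypothetical pairs in the style of the CLAMS subject-verb agreement
# challenge sets (an attractor noun intervenes in the first pair).
pairs = [
    ("the authors near the car are tall", "the authors near the car is tall"),
    ("the pilot smiles", "the pilot smile"),
]

# Stand-in scorer: prefers sentences found in a tiny toy "corpus".
# A real evaluation would replace this with a language model's
# log-probability of the sentence.
corpus = {"the authors near the car are tall", "the pilot smiles"}
score = lambda s: 1.0 if s in corpus else 0.0

print(minimal_pair_accuracy(score, pairs))  # → 1.0
```

With a real model the scorer would simply swap in, since `minimal_pair_accuracy` only assumes a callable mapping a sentence to a number.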
Database: OpenAIRE
External link: