Semantic Entropy in Language Comprehension

Authors: Matthew W. Crocker, Harm Brouwer, Noortje J. Venhuizen
Language: English
Year of publication: 2019
Source: Entropy, Volume 21, Issue 12
ISSN: 1099-4300
DOI: 10.3390/e21121159
Description: Language is processed on a more or less word-by-word basis, and the processing difficulty induced by each word is affected by our prior linguistic experience as well as our general knowledge about the world. Surprisal and entropy reduction have been independently proposed as linking theories between word processing difficulty and probabilistic language models. Extant models, however, are typically limited to capturing linguistic experience and hence cannot account for the influence of world knowledge. A recent comprehension model by Venhuizen, Crocker, and Brouwer (2019, Discourse Processes) improves upon this situation by instantiating a comprehension-centric metric of surprisal that integrates linguistic experience and world knowledge at the level of interpretation and combines them in determining online expectations. Here, we extend this work by deriving a comprehension-centric metric of entropy reduction from this model. In contrast to previous work, which has found that surprisal and entropy reduction are not easily dissociated, we do find a clear dissociation in our model. While both surprisal and entropy reduction derive from the same cognitive process (the word-by-word updating of the unfolding interpretation), they reflect different aspects of this process: state-by-state expectation (surprisal) versus end-state confirmation (entropy reduction).
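For orientation, the standard formulations of these two metrics as commonly used in the psycholinguistic literature can be sketched as follows; the article itself derives comprehension-centric analogues defined over semantic interpretations rather than over word strings. Here w_1, ..., w_t is the unfolding sentence and s ranges over its possible completions:

\mathrm{surprisal}(w_t) = -\log P(w_t \mid w_1, \ldots, w_{t-1})

H_t = -\sum_{s} P(s \mid w_1, \ldots, w_t)\,\log P(s \mid w_1, \ldots, w_t)

\Delta H_t = H_{t-1} - H_t

Surprisal thus tracks how expected the current word is given the preceding context, whereas entropy reduction tracks how much that word narrows down the distribution over possible end states, which is the contrast the abstract summarises as state-by-state expectation versus end-state confirmation.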
Database: OpenAIRE