Matching heard and seen speech: An ERP study of audiovisual word recognition
Author: | Natalya Kaganovich, Jennifer Schumaker, Courtney Rowland |
---|---
Year of publication: | 2016 |
Subject: | Speech perception; Speech-in-noise; Repetition priming; Word recognition; Articulatory gestures; Gestures; Articulation (phonetics); Active listening; Hearing; Audiology; Evoked potentials; N400; Facial expression; Acoustic stimulation; Photic stimulation; Comprehension; Noise; Humans; Male; Female; Young adult; Linguistics and Language; Language and Linguistics; Speech and Hearing; Cognitive Neuroscience; Experimental and Cognitive Psychology; Psychology |
Source: | Brain and Language, pp. 14-24 |
ISSN: | 0093-934X |
DOI: | 10.1016/j.bandl.2016.04.010 |
Abstract: | Seeing articulatory gestures while listening to speech-in-noise (SIN) significantly improves speech understanding, but the degree of this improvement varies greatly among individuals. We examined the relationship between two distinct stages of visual articulatory processing and SIN accuracy by combining a cross-modal repetition priming task with ERP recordings. Participants first heard a word referring to a common object (e.g., pumpkin) and then decided whether a subsequently presented silent visual articulation matched the word they had just heard. Incongruent articulations elicited a significantly enhanced N400, indicative of mismatch detection at the pre-lexical level, whereas congruent articulations elicited a significantly larger LPC, indexing articulatory word recognition. Only the N400 difference between incongruent and congruent trials correlated significantly with individuals' improvement in SIN accuracy in the presence of the talker's face. |
Database: | OpenAIRE |
External link: |