Author: |
Changfu Pei, Xunan Huang, Yuqin Li, Baodan Chen, Bin Lu, Yueheng Peng, Yajing Si, Xiabing Zhang, Tao Zhang, Dezhong Yao, Fali Li, Peng Xu |
Year of publication: |
2022 |
Subject: |
|
Source: |
Neuroscience. 502 |
ISSN: |
1873-7544 |
Description: |
Language is a remarkable cognitive ability that can be expressed through visual (written language) or auditory (spoken language) modalities. When visual characters and auditory speech convey conflicting information, individuals may selectively attend to either one. However, which modality dominates in such a competing situation, and the neural mechanism underlying this dominance, remain unclear. Here, we presented participants with Chinese sentences in which the visual characters and auditory speech conveyed conflicting information, while behavioral and electroencephalographic (EEG) responses were recorded. Results showed a prominent auditory dominance when audio-visual competition occurred. Specifically, higher accuracy (ACC), larger N400 amplitudes, and more linkages in the posterior occipital-parietal areas were demonstrated in the auditory mismatch condition compared to the visual mismatch condition. Our research illustrates the superiority of auditory speech over visual characters, extending our understanding of the neural mechanisms of audio-visual competition in Chinese. |
Database: |
OpenAIRE |
External link: |
|