Description: |
Research indicates that auditory and visual information is integrated during the perception of speech. Conflicting auditory and visual stimuli can produce an illusory experience known as the McGurk effect (e.g., auditory /bav/ dubbed onto a face saying /gav/ results in the percept "dav"). This study used a priming paradigm to investigate whether a phonemic representation of the auditory portion of a McGurk stimulus remains active after the illusory phoneme is experienced. Subjects were given nonword prime-target pairs in three conditions: (1) McGurk (e.g., prime auditory /bav/ + visual /gav/ = "dav"; target auditory /bav/); (2) Incongruent (e.g., prime auditory-visual /mav/, target auditory /bav/); (3) Identity (e.g., prime auditory-visual /yav/, target auditory /yav/). Results show that mean reaction times to repeat the targets were fastest in the identity condition. Response times in the McGurk and incongruent conditions were indistinguishable from one another and significantly slower than in the identity condition. This finding suggests that once the auditory and visual information is combined and a phonemic representation is formed, the original auditory signal is no longer available to affect processing of the target. [This work is based on ideas developed by the late Kerry P. Green and supported by NSF.]