Popis: |
Visual object recognition in the real world is not performed in isolation but depends on contextual information, such as the visual scene in which an object is found. Moreover, our perceptual experience is not purely visual: objects generate specific and unique sounds that can readily predict which objects are outside our field of view. Here, we test whether and how naturalistic sounds influence visual object processing and demonstrate that auditory information both accelerates visual information processing and modulates the perceptual representation of visual objects. Specifically, using a visual discrimination task and a novel set of ambiguous object stimuli, we find that naturalistic sounds shift visual representations towards the object features that match the sound (Exp. 1a-1b). In a series of control experiments, we replicate the original effect and show that these effects are not driven by decision or response biases (Exp. 2a-2b) and are not due to the high-level semantic content of sounds generating explicit expectations (Exp. 3). Instead, these sound-induced effects on visual perception appear to be driven by the continuous integration of multisensory inputs during perception itself. Together, our results demonstrate that visual processing is shaped by auditory context, which provides independent, supplemental information about the entities we encounter in the world.