Multiple spatial reference frames underpin perceptual recalibration to audio-visual discrepancies
Authors: Neil W. Roach, Michael A. Akeroyd, David M. Watson, Ben S. Webb
Contributors: Price, Nicholas Seow Chiang
Year of publication: 2021
Subjects: Male; Eye Movements; Vision Physiology; Visual System; Computer science; Audio Signal Processing; Speech recognition; Sensory Physiology; Social Sciences; Mathematical and Statistical Techniques; Psychology; Visual Signals; Multidisciplinary; Orientation (computer vision); Physics; Statistics; General Medicine; Adaptation, Physiological; Sensory Systems; Physical Sciences; Auditory Perception; Visual Perception; Engineering and Technology; Regression Analysis; Medicine; Female; Sensory Perception; General Agricultural and Biological Sciences; Research Article; Reference frame; Adult; Science; Adaptation (eye); Linear Regression Analysis; Research and Analysis Methods; General Biochemistry, Genetics and Molecular Biology; Young Adult; Perceptual system; Perception; Acoustic Signals; Humans; Sound Localization; Statistical Methods; Analysis of Variance; Cognitive Psychology; Biology and Life Sciences; Acoustics; Acoustic Stimulation; Signal Processing; Fixation (visual); Cognitive Science; Photic Stimulation; Mathematics; Neuroscience
Source: PLoS ONE, Vol. 16, Iss. 5, p. e0251827 (2021)
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0251827
Description: In dynamic multisensory environments, the perceptual system corrects for discrepancies that arise between modalities. For instance, in the ventriloquism aftereffect (VAE), spatial disparities introduced between visual and auditory stimuli lead to a perceptual recalibration of auditory space. Previous research has shown that the VAE is underpinned by multiple recalibration mechanisms tuned to different timescales; however, it remains unclear whether these mechanisms use common or distinct spatial reference frames. Here we asked whether the VAE operates in eye- or head-centred reference frames across a range of adaptation timescales, from a few seconds to a few minutes. We developed a novel paradigm for selectively manipulating the contribution of eye- versus head-centred visual signals to the VAE, in which auditory locations were defined relative to either the head orientation or the point of fixation. Consistent with previous research, we found that both eye- and head-centred frames contributed to the VAE across all timescales. However, we found no evidence for an interaction between spatial reference frames and adaptation duration. Our results indicate that the VAE is underpinned by multiple spatial reference frames that are similarly leveraged by the underlying time-sensitive mechanisms.
Database: OpenAIRE
External link:
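The paradigm described in the abstract turns on two ideas that a short sketch can make concrete: the same auditory location can be expressed in a head-centred or an eye-centred frame, differing only by the current gaze direction, and recalibration mechanisms operating at different timescales can be pictured as leaky integrators of the audio-visual discrepancy. The Python below is a minimal illustrative sketch under those assumptions only; the function names, the one-dimensional azimuth convention, and the two learning rates are hypothetical and are not taken from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' code): one-dimensional azimuths in degrees.
# A sound at head-centred azimuth a_head has eye-centred azimuth
# a_eye = a_head - gaze, where gaze is the eye-in-head azimuth at fixation.

def head_to_eye(a_head_deg, gaze_deg):
    """Convert a head-centred azimuth to an eye-centred one."""
    return a_head_deg - gaze_deg

def eye_to_head(a_eye_deg, gaze_deg):
    """Convert an eye-centred azimuth back to a head-centred one."""
    return a_eye_deg + gaze_deg

def recalibrate(discrepancies_deg, rates=(0.5, 0.05)):
    """Hypothetical multi-timescale recalibration: each mechanism is a leaky
    integrator of the residual audio-visual discrepancy (visual minus
    auditory, in degrees), with its own learning rate. Returns the summed
    shift of perceived auditory azimuth after the trial sequence."""
    states = np.zeros(len(rates))
    for d in discrepancies_deg:
        error = d - states.sum()          # residual discrepancy after current shift
        states += np.array(rates) * error # fast and slow mechanisms update in parallel
    return states.sum()

if __name__ == "__main__":
    gaze = 15.0                # fixation 15 deg right of straight ahead
    adapter_head = 10.0        # adapter location, head-centred azimuth
    print(head_to_eye(adapter_head, gaze))  # -5.0 deg in the eye-centred frame
    # A constant +8 deg audio-visual discrepancy over 60 adaptation trials:
    print(recalibrate([8.0] * 60))          # cumulative shift approaches 8 deg
```

Shifting fixation while keeping the head fixed dissociates the two frames: a stimulus held at a fixed head-centred azimuth changes its eye-centred azimuth by exactly the change in gaze, which is the lever the paradigm uses to attribute recalibration to one frame or the other.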