Human Scene-Selective Areas Represent 3D Configurations of Surfaces.
Author: | Lescroart MD; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720, USA., Gallant JL; Helen Wills Neuroscience Institute, University of California, Berkeley, Berkeley, CA 94720, USA; Department of Psychology, University of California, Berkeley, Berkeley, CA 94720, USA. Electronic address: gallant@berkeley.edu. |
---|---|
Language: | English |
Source: | Neuron [Neuron] 2019 Jan 02; Vol. 101 (1), pp. 178-192.e7. Date of Electronic Publication: 2018 Nov 26. |
DOI: | 10.1016/j.neuron.2018.11.004 |
Abstract: | It has been argued that scene-selective areas in the human brain represent both the 3D structure of the local visual environment and low-level 2D features (such as spatial frequency) that provide cues for 3D structure. To evaluate the degree to which each of these hypotheses explains variance in scene-selective areas, we develop an encoding model of 3D scene structure and test it against a model of low-level 2D features. We fit the models to fMRI data recorded while subjects viewed visual scenes. The fit models reveal that scene-selective areas represent the distance to and orientation of large surfaces, at least partly independent of low-level features. Principal component analysis of the model weights reveals that the most important dimensions of 3D structure are distance and openness. Finally, reconstructions of the stimuli based on the model weights demonstrate that our model captures unprecedented detail about the local visual environment from scene-selective areas. (Copyright © 2018 Elsevier Inc. All rights reserved.) |
Database: | MEDLINE |
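The analysis summarized in the abstract (a voxelwise encoding model fit to fMRI responses, followed by principal component analysis of the fitted weights) can be illustrated with a minimal sketch. The code below is not the authors' pipeline: it uses randomly generated data, scikit-learn's Ridge and PCA, and illustrative array shapes and variable names (X_train, Y_train, n_voxels, and so on) that are assumptions for demonstration, not values or identifiers from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.decomposition import PCA

# Hypothetical sizes: number of training/test scenes, model features, and voxels.
# These numbers are illustrative only.
n_train, n_test, n_features, n_voxels = 3600, 270, 20, 2000

rng = np.random.default_rng(0)
X_train = rng.standard_normal((n_train, n_features))  # stand-in for 3D-structure features per scene
X_test = rng.standard_normal((n_test, n_features))
Y_train = rng.standard_normal((n_train, n_voxels))    # stand-in for BOLD responses (scenes x voxels)
Y_test = rng.standard_normal((n_test, n_voxels))

# Fit a regularized linear encoding model for every voxel at once.
model = Ridge(alpha=10.0)
model.fit(X_train, Y_train)

# Evaluate prediction accuracy on held-out data: one correlation per voxel.
Y_pred = model.predict(X_test)
r = np.array([np.corrcoef(Y_test[:, v], Y_pred[:, v])[0, 1] for v in range(n_voxels)])
print("median held-out correlation:", np.median(r))

# Summarize the fitted weights across voxels with PCA; in the paper's analysis,
# the leading dimensions of the weights correspond to properties such as
# distance and openness (here the data are random, so the components are not meaningful).
pca = PCA(n_components=3)
scores = pca.fit_transform(model.coef_)  # coef_ has shape (n_voxels, n_features)
print("variance explained by top components:", pca.explained_variance_ratio_)
```

In a real analysis, X_train would hold scene annotations (for example, distances and orientations of surfaces) rather than random numbers, and the per-voxel correlations would be compared across competing feature spaces to assess how much variance each model explains.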