The role of stimulus-based cues and conceptual information in processing facial expressions of emotion
Author: | Thomas Murray, Justin O'Brien, Noam Sagiv, Lúcia Garrido |
Language: | English |
Year of publication: | 2021 |
Subject: | Cognitive Neuroscience; Experimental and Cognitive Psychology; Neuropsychology and Physiological Psychology; Cognitive psychology; Emotions; Facial expression; Faces; Face shape; Surface; Concepts; Cues; Stimulus (psychology); Similarity (psychology); Theory of mind; Perception; Humans; Brain; Brain Mapping; Superior temporal sulcus; Fusiform face area; Magnetic Resonance Imaging; Psychology |
ISSN: | 0010-9452 |
Description: | Data availability: Data (participant RDMs and models) and analysis code for reproducing the results of Experiment 1 are available at https://osf.io/xgw5a/. Data (raw anonymised imaging data, participant RDMs and models) and analysis code for reproducing the results of Experiment 2 are available at https://osf.io/34fm7/. Open practices: The study in this article earned Open Data, Open Materials and Preregistered badges for transparent practices. Data for this study can be found at https://osf.io/xgw5a/ and https://osf.io/34fm7/ (Experiment 2 data). Appendix A. Supplementary data: The following is the supplementary data to this article: Word document (60KB), available at https://ars.els-cdn.com/content/image/1-s2.0-S0010945221002938-mmc1.docx. Face shape and surface texture are two important cues that aid the perception of facial expressions of emotion. This perception is also influenced by high-level emotion concepts. Across two studies, we use representational similarity analysis to investigate the relative roles of shape, surface, and conceptual information in the perception, categorisation, and neural representation of facial expressions. In Study 1, 50 participants completed a perceptual task designed to measure the perceptual similarity of expression pairs, and a categorical task designed to measure the confusability between expression pairs when assigning emotion labels to a face. Using representational similarity analysis, we constructed three models of the similarities between emotions, each based on distinct information: two models were based on stimulus-based cues (face shapes and surface textures) and one model was based on emotion concepts. Using multiple linear regression, we found that behaviour in both tasks was related to the similarity of emotion concepts.
The model based on face shapes was more strongly related to behaviour in the perceptual task than in the categorical task, whereas the model based on surface textures was more strongly related to behaviour in the categorical task than in the perceptual task. In Study 2, 30 participants viewed facial expressions while undergoing fMRI, allowing us to measure the brain representational geometries of facial expressions of emotion in three core face-responsive regions (the Fusiform Face Area, Occipital Face Area, and Superior Temporal Sulcus) and in a region involved in theory of mind (Medial Prefrontal Cortex). Across all four regions, the representational distances between facial expression pairs were related to the similarities of emotion concepts, but not to either of the stimulus-based cues. Together, these results highlight the important top-down influence of high-level emotion concepts both on behavioural tasks and on the neural representation of facial expressions. Brunel University London. |
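The analysis described above (relating behavioural or neural RDMs to candidate model RDMs via multiple linear regression) can be sketched as follows. This is a minimal illustration, not the authors' actual analysis code (which is available at the OSF links above); the RDMs here are randomly generated toy data, and all names are hypothetical.

```python
import numpy as np

def upper_tri(rdm):
    """Vectorise the upper triangle of a square RDM (diagonal excluded)."""
    i, j = np.triu_indices(rdm.shape[0], k=1)
    return rdm[i, j]

def rsa_regression(target_rdm, model_rdms):
    """Regress a target (behavioural/neural) RDM on several model RDMs.

    Predictors and response are z-scored so betas are comparable across
    models; returns one beta per model (intercept discarded).
    """
    y = upper_tri(target_rdm)
    X = np.column_stack([upper_tri(m) for m in model_rdms])
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    y = (y - y.mean()) / y.std()
    X = np.column_stack([np.ones(len(y)), X])  # add intercept column
    betas, *_ = np.linalg.lstsq(X, y, rcond=None)
    return betas[1:]

# Toy example: six "expressions", three hypothetical model RDMs
rng = np.random.default_rng(0)

def random_rdm(n=6):
    m = rng.random((n, n))
    m = (m + m.T) / 2          # symmetric
    np.fill_diagonal(m, 0)     # zero self-dissimilarity
    return m

shape_rdm, surface_rdm, concept_rdm = (random_rdm() for _ in range(3))
# Simulated behavioural RDM dominated by the concept model
behav_rdm = 0.7 * concept_rdm + 0.2 * shape_rdm + 0.1 * surface_rdm
print(rsa_regression(behav_rdm, [shape_rdm, surface_rdm, concept_rdm]))
```

In this toy setup the beta for the concept model comes out largest, mirroring the qualitative pattern the abstract reports; the published analysis compares models with proper inference across participants rather than a single least-squares fit.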
Database: | OpenAIRE |
External link: |