The functional role of neural oscillations in non-verbal emotional communication

Author: Ashley E Symons, Wael El-Deredy, Michael Schwartze, Sonja A Kotz
Language: English
Year of publication: 2016
Subject:
Source: Frontiers in Human Neuroscience, Vol 10 (2016)
Document type: article
ISSN: 1662-5161
DOI: 10.3389/fnhum.2016.00239
Description: Effective interpersonal communication depends on the ability to perceive and interpret nonverbal emotional expressions from multiple sensory modalities. Current theoretical models propose that visual and auditory emotion perception involves a network of brain regions including the primary sensory cortices, the superior temporal sulcus (STS), and orbitofrontal cortex (OFC). However, relatively little is known about how the dynamic interplay between these regions gives rise to the perception of emotions. In recent years, there has been increasing recognition of the importance of neural oscillations in mediating neural communication within and between functional neural networks. Here we review studies investigating changes in oscillatory activity during the perception of visual, auditory, and audiovisual emotional expressions, and aim to characterise the functional role of neural oscillations in nonverbal emotion perception. Findings from the reviewed literature suggest that theta band oscillations most consistently differentiate between emotional and neutral expressions. While early theta synchronisation appears to reflect the initial encoding of emotionally salient sensory information, later fronto-central theta synchronisation may reflect the further integration of sensory information with internal representations. Additionally, gamma synchronisation reflects facilitated sensory binding of emotional expressions within regions such as the OFC, STS, and, potentially, the amygdala. However, the evidence is more ambiguous when it comes to the role of oscillations within the alpha and beta frequencies, which vary as a function of modality (or modalities), presence or absence of predictive information, and attentional or task demands. Thus, the synchronisation of neural oscillations within specific frequency bands mediates the rapid detection, integration, and evaluation of emotional expressions. Moreover, the functional coupling of oscillatory activity across multiple frequency bands supports a predictive coding model of multisensory emotion perception in which emotional facial and body expressions facilitate the processing of emotional vocalisations.
Database: Directory of Open Access Journals