Group Synchrony for Emotion Recognition Using Physiological Signals.

Author: Bota, Patricia; Zhang, Tianyi; El Ali, Abdallah; Fred, Ana; da Silva, Hugo Placido; Cesar, Pablo
Source: IEEE Transactions on Affective Computing; Oct-Dec 2023, Vol. 14 Issue 4, p2614-2625, 12p
Abstract: During group interactions, we react and modulate our emotions and behaviour to the group through phenomena including emotion contagion and physiological synchrony. Previous work on emotion recognition from video/images has shown that group context information improves classification performance. However, when using physiological data, the literature mostly focuses on intrapersonal models that leave out group information, while interpersonal models remain unexplored. This paper introduces a new interpersonal Weighted Group Synchrony approach, which relies on Electrodermal Activity (EDA) and Heart-Rate Variability (HRV). We analyse synchrony metrics applied across diverse data representations (EDA and HRV morphology and features, recurrence plot, spectrogram) to identify which metrics and modalities better characterise physiological synchrony for emotion recognition. We explored two datasets (AMIGOS and K-EmoCon), covering different group sizes (4 vs. dyad) and group-based activities (video-watching vs. conversation). The experimental results show that integrating group information improves arousal and valence classification across all datasets, with the exception of K-EmoCon on valence. The proposed method attained a mean M-F1 of ≈72.15% for arousal and ≈81.16% for valence on AMIGOS, and an M-F1 of ≈52.63% for arousal and ≈65.09% for valence on K-EmoCon, surpassing previous results for K-EmoCon on arousal and providing a new baseline on AMIGOS for long videos. [ABSTRACT FROM AUTHOR]
Database: Complementary Index