Positive/Negative Emotion Detection from RGB-D upper Body Images

Author: Ballihi, Lahoucine, Lablack, Adel, Ben Amor, Boulbaba, Bilasco, Ioan Marius, Daoudi, Mohamed
Contributors: Ben Amor, Boulbaba; FOX MIIRE (LIFL); Laboratoire d'Informatique Fondamentale de Lille (LIFL); Université de Lille, Sciences et Technologies; Institut National de Recherche en Informatique et en Automatique (Inria); Université de Lille, Sciences Humaines et Sociales; Centre National de la Recherche Scientifique (CNRS)
Language: English
Year of publication: 2014
Subject:
Source: International Workshop on FFER (Face and Facial Expression Recognition from Real World Videos), ICPR 2014, Aug 2014, Stockholm, Sweden
Description: International audience; The ability to identify users' mental states represents a valuable asset for improving human-computer interaction. Considering that spontaneous emotions are conveyed mostly through facial expressions and upper body movements, we propose to use these modalities together for the purpose of negative/positive emotion classification. A method that allows the recognition of mental states from videos is proposed. Based on a dataset composed of RGB-D movies, a set of indicators of positive and negative emotions is extracted from the 2D (RGB) information. In addition, a geometric framework is proposed to model the depth flows and capture human body dynamics from the depth data. Because spontaneous emotions are characterized by temporal changes in pixel and depth intensity, the depth features are used to relate changes in upper body movements to affect. We describe a space of depth and texture information to detect people's mood using upper body postures and their evolution over time. The experiments, performed on the Cam3D dataset, have shown promising results.
Database: OpenAIRE
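
The description above outlines a two-stream pipeline: 2D (RGB) indicators plus depth-flow body-dynamics features, fused and fed to a positive/negative classifier. The paper's actual indicators and geometric depth-flow framework are not reproduced here; the following is a minimal illustrative sketch under that reading, with hypothetical placeholder feature functions (rgb_indicators, depth_flow_features) and a linear SVM standing in for whatever classifier the authors used.

# Minimal sketch of the two-stream idea described in the abstract.
# The feature functions below are hypothetical placeholders, not the
# indicators or the geometric depth-flow framework from the paper.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def rgb_indicators(rgb_frames):
    # Placeholder for the 2D (RGB) positive/negative indicators:
    # per-frame mean/std intensity, pooled over the clip.
    stats = [(f.mean(), f.std()) for f in rgb_frames]
    return np.asarray(stats).mean(axis=0)

def depth_flow_features(depth_frames):
    # Placeholder for the depth-flow body-dynamics descriptor:
    # statistics of frame-to-frame absolute depth differences.
    diffs = [np.abs(b.astype(np.float32) - a.astype(np.float32))
             for a, b in zip(depth_frames[:-1], depth_frames[1:])]
    d = np.stack(diffs)
    return np.array([d.mean(), d.std(), d.max()])

def clip_descriptor(rgb_frames, depth_frames):
    # Early fusion: concatenate the RGB indicators and the depth dynamics.
    return np.concatenate([rgb_indicators(rgb_frames),
                           depth_flow_features(depth_frames)])

def train_classifier(descriptors, labels):
    # labels: 1 = positive emotion, 0 = negative emotion, one per clip.
    clf = make_pipeline(StandardScaler(), LinearSVC())
    clf.fit(np.vstack(descriptors), labels)
    return clf

Concatenating the two descriptors before classification (early fusion) is only one plausible reading of the abstract; a late-fusion variant that trains a classifier per modality and combines their scores would fit the text equally well.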