Improving emotional expression recognition of robots using regions of interest from human data

Author: Inge M. Hootsmans, Emilia I. Barakova, Pablo V. A. Barros, Matthias Kerzel, Romain H. A. Toebosch, Anne C. Bloem, Lena M. Opheij
Contributors: Industrial Design, Future Everyday, EAISI Health, Industrial Engineering and Innovation Sciences
Language: English
Year of publication: 2020
Source: HRI 2020-Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 142-144
Description: This paper is a first step toward equipping social robots with emotion recognition capabilities comparable to those of humans. Most recent deep learning solutions for facial expression recognition under-perform when deployed in human-robot interaction scenarios, even though they break records on a wide variety of facial expression recognition benchmarks. We believe the main reason is that these techniques were developed for recognising static images, whereas in real-life scenarios we infer emotions from intervals of expression. Exploiting the ability of CNNs to form regions of interest that resemble human gaze patterns, we use recordings of human gaze patterns to train such a network to infer facial emotions from 3-second video footage of humans expressing the 6 basic emotions.
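The core idea in the abstract, supervising a network's regions of interest with recorded human gaze, can be sketched minimally. The snippet below is an illustrative assumption, not the authors' implementation: it builds a Gaussian heatmap from hypothetical gaze fixation points on a video frame and scores how closely a model's spatial attention map matches it, a signal that could serve as an auxiliary training loss.

```python
import numpy as np

def gaze_heatmap(h, w, fixations, sigma=5.0):
    """Gaussian heatmap from (row, col) gaze fixations, normalised to sum to 1.
    Hypothetical helper; the paper does not specify this construction."""
    ys, xs = np.mgrid[0:h, 0:w]
    m = np.zeros((h, w))
    for fy, fx in fixations:
        m += np.exp(-((ys - fy) ** 2 + (xs - fx) ** 2) / (2 * sigma ** 2))
    return m / m.sum()

def attention_loss(model_attention, heatmap):
    """MSE between the network's (normalised) attention map and the gaze heatmap."""
    a = model_attention / model_attention.sum()
    return float(np.mean((a - heatmap) ** 2))

# Toy example: one 48x48 frame, gaze concentrated near eyes and mouth.
hm = gaze_heatmap(48, 48, [(18, 16), (18, 32), (34, 24)])
att = np.random.default_rng(0).random((48, 48))  # stand-in for a CNN attention map
loss = attention_loss(att, hm)
```

Averaging such a loss over the frames of each 3-second clip, alongside the usual classification loss over the 6 basic emotions, is one plausible way to realise the gaze-guided training the abstract describes.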
Database: OpenAIRE