Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots
Author: | Ravi Vaidyanathan, Richard Craig, Appolinaire C. Etoundi, Christopher J. James, Chris Melhuish, Prashant Iyengar, Sneh Vaswani, Payam Barnaghi, Maitreyee Wairagkar, Hugo Weissbart, Maria R. Lima, Tobias Reichenbach, Daniel Bazo |
Contributors: | Medical Research Council |
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences; Computer Science - Robotics (cs.RO); Computer Networks and Communications; Computer science; 270 Language and Computation in Neural Systems; 0805 Distributed Computing; 02 engineering and technology; TS Display device; Human–computer interaction; 1005 Communications Technologies; 0202 electrical engineering, electronic engineering, information engineering; Feature (machine learning); Abstraction (linguistics); Facial expression; Social robot; technology, industry and agriculture; 020206 networking & telecommunications; Computer Science Applications; body regions; Emotive; Hardware and Architecture; Face (geometry); Signal Processing; Robot; 020201 artificial intelligence & image processing; human activities; Information Systems |
Source: | IEEE Internet of Things Journal, 9(5), pp. 3174-3188 |
ISSN: | 2327-4662 |
Description: | We introduce the conceptual formulation, design, fabrication, control, and commercial translation, with IoT connectivity, of a hybrid-face social robot, together with validation of human emotional response to its affective interactions. The hybrid-face robot integrates a 3D-printed faceplate with a digital display to simplify the conveyance of complex facial movements while providing the impression of three-dimensional depth for natural interaction. We map the robot's space of potential emotions to specific facial feature parameters and characterise the recognisability of the humanoid hybrid-face robot's archetypal facial expressions. We introduce pupil dilation as an additional degree of freedom for conveying emotive states. Human interaction experiments demonstrate that emotion can be effectively conveyed from the hybrid robot face to human observers, as shown by mapping their neurophysiological electroencephalography (EEG) responses to perceived emotional information and through interviews. Results show that the main hybrid-face robot expressions can be discriminated with recognition rates above 80% and evoke human emotive responses similar to those elicited by actual human faces, as measured by the face-specific N170 event-related potentials in EEG. The hybrid-face robot concept has been modified, implemented, and released in the commercial IoT robotic platform Miko (My Companion), an affective robot with facial and conversational features currently in use for human-robot interaction with children, developed by Emotix Inc. We demonstrate that human EEG responses to Miko emotions are comparable to the neurophysiological responses observed for actual human facial recognition. Finally, interviews show expression recognition rates above 90% for our commercial robot. We conclude that simplified hybrid-face abstraction conveys emotions effectively and enhances human-robot interaction. This work has been submitted to the IEEE for possible publication.
Copyright may be transferred without notice, after which this version may no longer be accessible |
Database: | OpenAIRE |
External link: |