Haptic Human-Robot Affective Interaction in a Handshaking Social Protocol
Author: Yacine Tsalamlal, Yoren Gaffary, Adriana Tapus, Virginie Demulier, Mehdi Ammi, Sylvain Caillou, Jean-Claude Martin
Contributors: Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur (LIMSI), Université Paris Saclay (COmUE)-Centre National de la Recherche Scientifique (CNRS)-Sorbonne Université - UFR d'Ingénierie (UFR 919), Sorbonne Université (SU)-Université Paris-Saclay-Université Paris-Sud - Paris 11 (UP11), IEEE
Year of publication: 2015
Subject: Human-robot interaction (HRI); Handshaking; Haptics; Haptic technology; Emotion; Facial expression; Social robot; Humanoid robot; Robot; Human-computer interaction; Joint stiffness; Simulation; Computer science
Source: IEEE International Conference on Human-Robot Interaction (HRI), IEEE, Mar 2015, Portland, United States
Description: Robots are more and more present in our daily lives. In human-robot interaction, a socially intelligent robot should be capable of understanding the context of the interaction with the human so as to behave in a proper manner by following some social rules. This paper focuses on haptic affective social interaction during a greeting handshake between a human and a robot. The main goal of this work is to study how the haptic feedback involved in the human-robot handshake can convey emotions and, more precisely, how it can influence the perception of emotions expressed through the facial expressions of the robot. Moreover, we examine the benefits of multimodality (i.e., visuo-haptic) over monomodality (i.e., visual-only and haptic-only). The experimental results with the Meka robot show that the multimodal (i.e., visuo-haptic) conditions presenting high values for grasping force and stiffness of movement are rated higher on the arousal and dominance dimensions than the visual-only condition. Furthermore, the analysis of the results corresponding to the monomodal haptic condition shows that participants discriminate well between the dominance and arousal dimensions of the haptic behaviours presenting low and high values for grasping force and stiffness of movement.
Database: OpenAIRE
External link:
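The record itself gives no implementation details, but the manipulation described in the abstract (low vs. high grasping force and movement stiffness, crossed with visual-only, haptic-only, and visuo-haptic presentation) can be illustrated with a minimal sketch. This is not the authors' code: the class, condition labels, and numeric levels below (HandshakeBehaviour, grasp_force_n, joint_stiffness_nm_per_rad, LOW_FORCE, HIGH_STIFFNESS, etc.) are hypothetical placeholders for the kind of condition table such an experiment would enumerate.

```python
from dataclasses import dataclass
from enum import Enum
from itertools import product


class Modality(Enum):
    VISUAL_ONLY = "visual-only"    # facial expression only, no handshake
    HAPTIC_ONLY = "haptic-only"    # handshake only, no facial expression
    VISUO_HAPTIC = "visuo-haptic"  # handshake combined with facial expression


@dataclass(frozen=True)
class HandshakeBehaviour:
    """One experimental handshake condition (illustrative values only)."""
    modality: Modality
    grasp_force_n: float                # grasping force of the robot hand (N), hypothetical
    joint_stiffness_nm_per_rad: float   # stiffness of the arm movement (N*m/rad), hypothetical


# Hypothetical low/high levels for the two haptic parameters named in the
# abstract; the values actually used with the Meka robot are not reported here.
LOW_FORCE, HIGH_FORCE = 5.0, 20.0
LOW_STIFFNESS, HIGH_STIFFNESS = 2.0, 10.0


def build_conditions() -> list[HandshakeBehaviour]:
    """Enumerate the haptic behaviours crossed with the presentation modality."""
    conditions = []
    for force, stiffness in product((LOW_FORCE, HIGH_FORCE),
                                    (LOW_STIFFNESS, HIGH_STIFFNESS)):
        for modality in (Modality.HAPTIC_ONLY, Modality.VISUO_HAPTIC):
            conditions.append(HandshakeBehaviour(modality, force, stiffness))
    # The visual-only baseline has no haptic parameters to vary.
    conditions.append(HandshakeBehaviour(Modality.VISUAL_ONLY, 0.0, 0.0))
    return conditions


if __name__ == "__main__":
    for c in build_conditions():
        print(f"{c.modality.value:12s}  force={c.grasp_force_n:5.1f} N  "
              f"stiffness={c.joint_stiffness_nm_per_rad:5.1f} N*m/rad")
```

Running the script simply prints the nine resulting conditions; in the study, each haptic condition would additionally be paired with participant ratings on the arousal and dominance dimensions.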