Popis: |
Social robots coexist with humans in situations where they must exhibit appropriate communication skills. Since users differ in their characteristics and ways of communicating, personalizing human-robot interactions is essential to their success. This manuscript presents an Active Learning approach that combines computer vision and human-robot interaction for user recognition and profiling in order to personalize robot behavior. The system identifies people using the Intel face-detection-retail-0004 model together with FaceNet for face recognition, and obtains users' information through interaction. It aims to improve human-robot interaction by (i) using online learning to allow the robot to identify users and (ii) retrieving users' information to fill out their profiles and adapt the robot's behavior. Since user information is necessary to adapt the robot for each interaction, we hypothesized that users would find creating their profile by interacting with the robot more entertaining and easier than taking a survey. We validated this hypothesis in three scenarios: participants completed their profiles using an online survey, by interacting with a dull robot, or by interacting with a cheerful robot. The results show that participants gave the cheerful robot a higher usability score (82.14/100 points) and were more entertained while creating their profiles with the cheerful robot than in the other scenarios. Statistically significant differences in usability were found between the two robot scenarios and the online-survey scenario. Finally, we present two scenarios in which the robot interacts with a known user and with an unknown user to demonstrate how it adapts to each situation. The research leading to these results has received funding from the projects Robots Sociales para Estimulación Física, Cognitiva y Afectiva de Mayores (ROSES), RTI2018-096338-B-I00, funded by the Spanish Ministry of Science, Innovation and Universities, and Robots sociales para mitigar la soledad y el aislamiento en mayores (SOROLI), PID2021-123941OA-I00, funded by the Agencia Estatal de Investigación (AEI), Spanish Ministry of Science and Innovation. This publication is part of the R&D&I project PLEC2021-007819 funded by MCIN/AEI/10.13039/501100011033 and by the European Union NextGenerationEU/PRTR.
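To illustrate the identification step outlined above, the following minimal Python sketch matches a face embedding (e.g., a FaceNet vector produced by an upstream detector and encoder, which is assumed here and not shown) against a gallery of known users and enrolls unknown users online. The UserGallery class, the 0.7 cosine-similarity threshold, and the user ids are illustrative assumptions, not the authors' implementation.

from __future__ import annotations

import numpy as np


class UserGallery:
    """Keeps one reference embedding per known user and matches new faces against them."""

    def __init__(self, threshold: float = 0.7):
        self.threshold = threshold  # minimum cosine similarity accepted as a match (illustrative value)
        self.embeddings: dict[str, np.ndarray] = {}

    @staticmethod
    def _normalize(v: np.ndarray) -> np.ndarray:
        return v / (np.linalg.norm(v) + 1e-12)

    def identify(self, embedding: np.ndarray) -> str | None:
        """Return the best-matching known user id, or None if the face is unknown."""
        emb = self._normalize(embedding)
        best_id, best_sim = None, -1.0
        for user_id, ref in self.embeddings.items():
            sim = float(np.dot(emb, ref))  # cosine similarity of unit-length vectors
            if sim > best_sim:
                best_id, best_sim = user_id, sim
        return best_id if best_sim >= self.threshold else None

    def enroll(self, user_id: str, embedding: np.ndarray) -> None:
        """Register a new user online so the robot can recognize them in later interactions."""
        self.embeddings[user_id] = self._normalize(embedding)


if __name__ == "__main__":
    gallery = UserGallery()
    rng = np.random.default_rng(0)
    alice = rng.normal(size=512)  # stand-in for a FaceNet embedding
    gallery.enroll("alice", alice)
    print(gallery.identify(alice + 0.05 * rng.normal(size=512)))  # expected: "alice"
    print(gallery.identify(rng.normal(size=512)))                 # expected: None (unknown user)

In this sketch, a face that falls below the similarity threshold is treated as an unknown user, which is the point at which a profile-filling dialogue such as the one evaluated in the study could be triggered.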