Detecting, locating and recognising human touches in social robots with contact microphones
Author: | María Malfaz, Juan José Gamboa-Montero, Fernando Alonso-Martín, José Carlos Castillo, Miguel A. Salichs |
---|---|
Contributors: | Comunidad de Madrid, Ministerio de Ciencia, Innovación y Universidades (Spain) |
Language: | English |
Year of publication: | 2020 |
Subject: | Social robots; Touch gesture recognition; Acoustic sensing; Touch localisation; Human–robot interaction; Human–computer interaction; Machine learning applications; Artificial Intelligence; Computer science; Electrical and Electronic Engineering; Control and Systems Engineering; Robótica e Informática Industrial |
Source: | e-Archivo. Repositorio Institucional de la Universidad Carlos III de Madrid |
Description: | There are many situations in our daily life where touch gestures occur during natural human–human interaction: meeting people (shaking hands), personal relationships (caresses), moments of celebration or sadness (hugs), etc. Considering that robots are expected to form part of our daily life in the future, they should be endowed with the capacity to recognise these touch gestures and the part of the robot's body that has been touched, since the meaning of a gesture may differ depending on its location. Therefore, this work presents a learning system for both purposes: to detect and recognise the type of touch gesture (stroke, tickle, tap and slap) and to localise it. The interpretation of the meaning of the gesture is out of the scope of this paper. Different technologies have been applied to perceive touch in social robots, commonly using a large number of sensors. Instead, our approach uses three contact microphones installed inside some parts of the robot. The audio signals generated when the user touches the robot are sensed by the contact microphones and processed using Machine Learning techniques. We acquired information from sensors installed in two social robots, Maggie and Mini (both developed by the RoboticsLab at the Carlos III University of Madrid), and a real-time version of the whole system has been deployed in the robot Mini. The system allows the robot to sense whether it has been touched, to recognise the kind of touch gesture, and to estimate its approximate location. The main advantage of using contact microphones as touch sensors is that a single microphone can "cover" a whole solid part of the robot. In addition, the sensors are unaffected by ambient noise such as human voices, TV or music. However, using several contact microphones means that a single touch gesture may be detected by all of them, with each potentially recognising a different gesture at the same time. The results show that the system is robust against this phenomenon.
Moreover, the accuracy obtained for both robots is about 86%. The research leading to these results has received funding from the project "Robots Sociales para Estimulación Física, Cognitiva y Afectiva de Mayores (ROSES)", funded by the Spanish Ministerio de Ciencia, Innovación y Universidades, and from RoboCity2030-DIH-CM, Madrid Robotics Digital Innovation Hub, S2018/NMT-4331, funded by "Programas de Actividades I+D en la Comunidad de Madrid" and co-funded by Structural Funds of the EU. Published |
Database: | OpenAIRE |
External link: |
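The abstract describes a pipeline in which audio frames from three contact microphones are classified with Machine Learning, and a touch may be picked up by all microphones at once. The record does not specify the features, the classifier, or the fusion rule actually used, so the following is only a minimal illustrative sketch: coarse spectral-band energies as features, a nearest-centroid classifier standing in for the paper's ML model, and an assumed "loudest microphone wins" rule to resolve conflicting per-microphone predictions. All names and parameters here are hypothetical.

```python
import numpy as np

GESTURES = ["stroke", "tickle", "tap", "slap"]  # gesture set from the abstract

def extract_features(frame):
    """Log-energy in 4 coarse frequency bands of one audio frame.
    A stand-in for whatever features the authors actually used."""
    spectrum = np.abs(np.fft.rfft(frame))
    bands = np.array_split(spectrum, 4)
    energies = np.array([np.sum(b ** 2) for b in bands])
    return np.log(energies + 1e-12)

class NearestCentroid:
    """Minimal classifier standing in for the paper's ML model."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array(
            [np.mean([x for x, lab in zip(X, y) if lab == label], axis=0)
             for label in self.labels_])
        return self

    def predict(self, x):
        distances = np.linalg.norm(self.centroids_ - x, axis=1)
        return self.labels_[int(np.argmin(distances))]

def fuse(frames, model):
    """Assumed fusion across microphones: when several mics sense the same
    touch, keep the prediction from the mic with the highest signal energy.
    This also gives a coarse localisation (which body part was touched)."""
    energies = [float(np.sum(np.square(f))) for f in frames]
    best_mic = int(np.argmax(energies))
    gesture = model.predict(extract_features(frames[best_mic]))
    return best_mic, gesture
```

Because each microphone "covers" a whole solid part of the robot, the index of the winning microphone doubles as the approximate touch location, which is how this sketch mirrors the localisation described in the abstract.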