Detecting and Classifying Human Touches in a Social Robot Through Acoustic Sensing and Machine Learning

Authors: Fernando Alonso-Martín, Juan José Gamboa-Montero, José Carlos Castillo, Miguel A. Salichs, Álvaro Castro-González
Year of publication: 2017
Subjects:
02 engineering and technology
0202 electrical engineering, electronic engineering, information engineering
020201 artificial intelligence & image processing
0209 industrial biotechnology
020901 industrial engineering & automation
Acoustic sensing
Acoustics
Analytical Chemistry
Article
Artificial intelligence
Atomic and Molecular Physics, and Optics
Biochemistry
Computer vision
Contact microphone
Electrical and Electronic Engineering
Engineering
Gesture recognition
Gestures
Human-robot interaction
Humans
Instrumentation
Machine learning
Microphone
Robot
Robot learning
Robotics
Robótica e Informática Industrial
Social robot
Touch
Touch interaction
Source: e-Archivo. Repositorio Institucional de la Universidad Carlos III de Madrid
Sensors (Basel, Switzerland); Volume 17; Issue 5; Article 1138
DOI: 10.3390/s17051138
Description: An important aspect of Human-Robot Interaction is responding to different kinds of touch stimuli. To date, several technologies have been explored to determine how a touch is perceived by a social robot, usually by placing a large number of sensors throughout the robot's shell. In this work, we introduce a novel approach in which the audio acquired from contact microphones located in the robot's shell is processed using machine learning techniques to distinguish between different types of touches. The system is able to determine when the robot is touched (touch detection) and to ascertain the kind of touch performed among a set of possibilities: stroke, tap, slap, and tickle (touch classification). This proposal is cost-effective: a single microphone is enough to cover each solid part of the robot, so just a few microphones can cover the whole shell. Moreover, it is easy to install and configure, as it only requires attaching the microphone to a contact surface on the robot's shell and plugging it into the robot's computer. Results show high accuracy in touch gesture recognition. The testing phase revealed that Logistic Model Trees achieved the best performance, with an F-score of 0.81. The dataset was built with information from 25 participants performing a total of 1981 touch gestures. The research leading to these results has received funding from the projects: Development of social robots to help seniors with cognitive impairment (ROBSEN), funded by the Ministerio de Economía y Competitividad; and RoboCity2030-III-CM, funded by the Comunidad de Madrid and cofunded by Structural Funds of the EU. Published.
Database: OpenAIRE