Human hand recognition from robotic skin measurements in human-robot physical interactions
Author: | Giorgio Cannata, Simone Denei, Alessandro Albini |
Year of publication: | 2017 |
Subject: | Contextual image classification, Computer vision, Computer Vision and Pattern Recognition, Convolutional neural network, Bag-of-words model in computer vision, Human–robot interaction, Motion (physics), Robot, Control and Systems Engineering, Artificial intelligence, Representation (mathematics), Computer science, Software |
Source: | IROS |
DOI: | 10.1109/iros.2017.8206300 |
Description: | This paper addresses the problem of using the tactile feedback generated by a robotic skin to discriminate a human hand touch from a generic contact. Humans understand collaboration intentions through different sensing modalities such as vision, hearing and touch. Among these, physical interaction is mainly used for demonstrating or correcting a motion, and it usually begins by touching the other person's body with the hands. Until recently, it was difficult to achieve the same in human-robot cooperation due to the lack of large-scale tactile systems functionally similar to human skin. Our approach consists of transforming the measurements of sensors distributed on the robot body into a convenient 2D representation of the contact shape, i.e., a contact image, and then applying image classification techniques to discriminate a human touch from an unexpected collision. Experiments were performed on a robotic skin composed of 768 pressure sensors integrated on a Baxter robot forearm. More than 1800 contact images were generated from 43 different persons for training and testing two machine learning algorithms: Bag of Visual Words and Convolutional Neural Networks. The experimental results show that both approaches are valid, achieving a classification accuracy higher than 96%. |
Database: | OpenAIRE |
External link: |
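The contact-image construction summarized in the description (mapping scattered skin pressure readings to a 2D image that a classifier can consume) can be sketched as follows. This is a minimal illustration under stated assumptions: the grid resolution, the normalized taxel coordinates, and the max-per-cell binning are hypothetical choices, not the paper's actual skin-to-image mapping.

```python
import numpy as np

def contact_image(taxels, grid=(24, 32)):
    """Bin scattered taxel readings into a 2D contact image.

    taxels: iterable of (x, y, p) tuples, with x and y normalized to
    [0, 1) over the skin patch and p a pressure value.
    grid: output image resolution (rows, cols) -- an illustrative
    assumption, not the mapping used in the paper.
    """
    img = np.zeros(grid)
    for x, y, p in taxels:
        r = min(int(y * grid[0]), grid[0] - 1)
        c = min(int(x * grid[1]), grid[1] - 1)
        img[r, c] = max(img[r, c], p)  # keep the strongest reading per cell
    # Normalize to [0, 1] so the classifier sees the contact shape
    # rather than the absolute force scale.
    if img.max() > 0:
        img /= img.max()
    return img

# Toy example: a small press near the center of the patch.
readings = [(0.50, 0.50, 2.0), (0.52, 0.50, 1.5), (0.50, 0.52, 1.0)]
img = contact_image(readings)
```

The resulting image can then be fed to an image classifier such as a CNN or a Bag of Visual Words pipeline, which is the role the contact image plays in the paper's approach.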