Abstract: |
Humans sense and interpret touch on their skin during social interaction. Similarly, robotic systems equipped with human-like robotic skin can interact intuitively with humans. Many tactile sensors have therefore been developed, but their large number of sensing elements, narrow sensing bandwidth, and fragility limit their use as robotic skin. This article proposes a robotic skin structure that mimics the human skin layers and the Pacinian corpuscle using a textured resilient fabric, a uniquely structured airmesh, and encapsulated microphones. The skin reproduces functions such as tactile stimulus encoding, tactile stimulus dispersion, elastomechanical properties, and a wide sensitivity bandwidth. The developed skin identifies touch locations and patterns using a small number of sensing nodes together with algorithms that interpret tactile sensations: passive acoustic tomography to localize touch, a signal-intensity map and spectrograms to encode the spatiotemporal characteristics of touch, and a convolutional neural network to decode and classify touch. The algorithms localized tactile stimuli with a mean error of 1.8 cm and classified touch into nine classes with an accuracy of 93.3%. Because the robotic skin contains no rigid material and uses only a few sensing nodes, it easily conforms to large nonplanar surfaces. The skin was implemented on a robotic arm to demonstrate physical human–robot interaction and on a vertical cylindrical surface, similar in size to a social robot, to demonstrate scaling up to larger systems with a lower sensing-node density.
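To make the localization idea concrete, the following is a minimal sketch of touch localization from time differences of arrival (TDOA) at a few microphones, one simple way to realize the passive acoustic localization the abstract mentions. The microphone layout, wave speed, and grid resolution below are illustrative assumptions, not values from the paper; the paper's actual tomography method may differ.

```python
import numpy as np

def localize_touch(mic_pos, tdoa, wave_speed, extent=0.30, step=0.005):
    """Return the grid point whose predicted TDOAs best match the measured ones.

    mic_pos    : (M, 2) microphone coordinates in metres (assumed layout)
    tdoa       : (M,) arrival-time differences relative to mic 0, in seconds
    wave_speed : assumed propagation speed of the tactile wave, in m/s
    """
    # Candidate touch points on a regular grid over the skin patch.
    xs = np.arange(0.0, extent + step, step)
    gx, gy = np.meshgrid(xs, xs, indexing="ij")
    pts = np.stack([gx.ravel(), gy.ravel()], axis=1)                 # (N, 2)
    # Distance from every candidate point to every microphone.
    dist = np.linalg.norm(pts[:, None, :] - mic_pos[None], axis=2)   # (N, M)
    # Predicted TDOAs (relative to mic 0) for each candidate point.
    pred = (dist - dist[:, :1]) / wave_speed
    # Pick the candidate whose predicted TDOAs best fit the measurements.
    err = np.sum((pred - tdoa) ** 2, axis=1)
    return pts[np.argmin(err)]

# Noise-free check: four microphones at the corners of a 30 cm square patch.
mics = np.array([[0.0, 0.0], [0.30, 0.0], [0.0, 0.30], [0.30, 0.30]])
true_pt = np.array([0.10, 0.18])
v = 80.0  # assumed surface-wave speed in the fabric, m/s
arrivals = np.linalg.norm(mics - true_pt, axis=1) / v
est = localize_touch(mics, arrivals - arrivals[0], v)
print(np.round(est, 3))
```

With exact, noise-free TDOAs the grid search recovers the touch point up to the grid resolution; in practice, onset-time estimation noise and dispersion in the fabric would dominate the error, which is why the paper also relies on learned spatiotemporal features.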