Cross-Modality Interaction Network for Equine Activity Recognition Using Imbalanced Multi-Modal Data

Authors: Endai Huang, Weitao Xu, Axiu Mao, Kai Liu, Haiming Gan, Rebecca S. V. Parkes
Language: English
Year of publication: 2021
Source: Sensors (Basel, Switzerland), Vol 21, Iss 17, Article 5818 (2021)
ISSN: 1424-8220
Description: With the recent advances in deep learning, wearable sensors have increasingly been used in automated animal activity recognition. However, two major challenges remain in improving recognition performance: multi-modal feature fusion and imbalanced data modeling. In this study, to improve classification performance for equine activities while tackling these two challenges, we developed a cross-modality interaction network (CMI-Net) involving a dual convolutional neural network architecture and a cross-modality interaction module (CMIM). The CMIM adaptively recalibrated the temporal- and axis-wise features in each modality by leveraging multi-modal information to achieve deep intermodality interaction. A class-balanced (CB) focal loss was adopted to supervise the training of CMI-Net to alleviate the class imbalance problem. Motion data were acquired from six neck-attached inertial measurement units (IMUs) on six horses. The CMI-Net was trained and verified with leave-one-out cross-validation. The results demonstrated that our CMI-Net outperformed the existing algorithms, with high precision (79.74%), recall (79.57%), F1-score (79.02%), and accuracy (93.37%). The adoption of CB focal loss improved the performance of CMI-Net, with increases of 2.76%, 4.16%, and 3.92% in precision, recall, and F1-score, respectively. In conclusion, CMI-Net and CB focal loss effectively enhanced the equine activity classification performance using imbalanced multi-modal sensor data.
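The class-balanced (CB) focal loss named in the description combines the "effective number of samples" re-weighting of Cui et al. (2019) with the focal-loss modulation of Lin et al. (2017). The following is a minimal PyTorch sketch of that combination, not the authors' released implementation; the function name, the beta and gamma defaults, and the samples_per_class argument are illustrative assumptions.

import torch
import torch.nn.functional as F

def class_balanced_focal_loss(logits, targets, samples_per_class,
                              beta=0.9999, gamma=2.0):
    """Class-balanced focal loss (sketch).

    logits:            (batch, num_classes) raw network outputs
    targets:           (batch,) integer class labels
    samples_per_class: (num_classes,) training-set counts per class
    """
    # Effective number of samples per class: (1 - beta^n)
    effective_num = 1.0 - torch.pow(beta, samples_per_class.float())
    # Class weights proportional to (1 - beta) / (1 - beta^n),
    # normalised so they sum to the number of classes
    weights = (1.0 - beta) / effective_num
    weights = weights / weights.sum() * len(samples_per_class)
    weights = weights.to(logits.device)

    # Per-sample cross entropy and the focal down-weighting of easy examples
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)                      # probability of the true class
    focal = (1.0 - pt) ** gamma * ce

    # Weight each sample by the CB weight of its true class
    return (weights[targets] * focal).mean()

# Example call with hypothetical class counts for the equine activities:
# counts = torch.tensor([5000., 1200., 800., 400., 150.])
# loss = class_balanced_focal_loss(model_output, labels, counts)

Under this formulation, rare activity classes receive larger per-class weights while the focal term further down-weights well-classified samples, which is the imbalance-handling behaviour the description attributes to the CB focal loss.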
Database: OpenAIRE