Modified motion influence map and recurrent neural network-based monitoring of the local unusual behaviors for fish school in intensive aquaculture
Authors: Zhangying Ye, Huanda Lu, Bao Weijun, Songming Zhu, Mingwei Shen, Ying Liu, Zhang Fengdeng, Jian Zhao
Year of publication: 2018
Subjects: 0301 basic medicine, Information transfer, Basis (linear algebra), Artificial neural network, business.industry, Visibility (geometry), Pattern recognition, 04 agricultural and veterinary sciences, Aquatic Science, Biology, Tracking (particle physics), Motion (physics), 03 medical and health sciences, 030104 developmental biology, Recurrent neural network, 040102 fisheries, 0401 agriculture forestry and fisheries, Segmentation, Artificial intelligence, business
Source: Aquaculture, 493: 165-175
ISSN: 0044-8486
DOI: 10.1016/j.aquaculture.2018.04.064
Abstract: To enable accurate monitoring of local unusual behaviors of fish schools in intensive aquaculture, this study proposes a novel and practical method, based mainly on a modified motion influence map and a recurrent neural network (RNN), that detects, localizes and recognizes local unusual behaviors systematically. First, the motion characteristics of the whole fish school were extracted with a particle advection scheme, without tracking or foreground segmentation. Second, a modified motion influence map representing the interaction characteristics within the fish school was constructed by considering speed, direction, distance, visibility and the modes of information transfer. Then, on the basis of the constructed motion influence map, local unusual behaviors were detected and localized simultaneously using a "minimum-distance matrix" framework. Finally, after further quantification, the localized unusual behaviors were recognized by a customized RNN. Tests on a behavior dataset containing three typical local unusual behaviors verified the performance of the method (detection, localization and recognition accuracies of 98.91%, 91.67% and 89.89%, respectively), which outperformed several state-of-the-art methods.
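The abstract only outlines the pipeline; the sketch below is a minimal, illustrative approximation of how a motion influence map could be accumulated from particle velocities, weighting influence by speed, decaying it with distance, and restricting it to a visibility cone around each particle's heading. It is not the paper's implementation: the function name, grid layout and parameters (visibility_angle, max_range, cell_size) are assumptions made for this example.

```python
import numpy as np

def motion_influence_map(positions, velocities, grid_shape, cell_size,
                         visibility_angle=np.pi / 3, max_range=5.0):
    """Accumulate a simple motion influence map on a regular grid.

    Each particle casts influence onto grid cells that lie within
    `max_range` and inside a cone of half-angle `visibility_angle`
    around its velocity direction; influence grows with speed and
    decays with distance. Illustrative only, not the paper's method.
    """
    infl = np.zeros(grid_shape)
    # centres of every grid cell, in the same units as the positions
    ys, xs = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
    centres = np.stack([(xs + 0.5) * cell_size, (ys + 0.5) * cell_size], axis=-1)

    for p, v in zip(positions, velocities):
        speed = np.linalg.norm(v)
        if speed < 1e-6:
            continue                              # stationary particle: no influence
        offset = centres - p                      # vectors from particle to cell centres
        dist = np.linalg.norm(offset, axis=-1) + 1e-6
        cos_ang = (offset @ (v / speed)) / dist   # cosine of angle to motion direction
        visible = (dist < max_range) & (cos_ang > np.cos(visibility_angle))
        infl[visible] += speed / dist[visible]    # speed-weighted, distance-decayed
    return infl

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0, 32, size=(50, 2))        # 50 particles in a 32 x 32 area
    vel = rng.normal(0, 1, size=(50, 2))
    m = motion_influence_map(pos, vel, grid_shape=(32, 32), cell_size=1.0)
    print("map shape:", m.shape, "max influence:", m.max())
```

In the paper's pipeline, per-cell maps of this kind would then be compared (e.g., via a minimum-distance criterion) and the resulting features fed to an RNN for recognition; those stages are omitted here.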
Database: OpenAIRE
External link: