DPANet: Dual Pooling‐aggregated Attention Network for fish segmentation
Author: Zhenshan Bao, Wenbo Zhang, Chaoyi Wu
Language: English
Year of publication: 2022
Subject: computer science; computer vision; pattern recognition; image segmentation; convolutional neural networks; attention networks; fish segmentation; machine learning (artificial intelligence); software; computer applications to medicine / medical informatics (R858-859.7); computer software (QA76.75-76.765)
Source: IET Computer Vision, Vol 16, Iss 1, Pp 67-82 (2022)
ISSN: 1751-9632; 1751-9640
Description: The sustainable development of marine fisheries depends on accurate measurement of fish-stock data. Semantic segmentation methods based on deep learning can automatically produce segmentation masks of fish in images from which such measurements are obtained. However, general semantic segmentation methods cannot accurately segment fish objects in underwater images. In this study, a Dual Pooling-aggregated Attention Network (DPANet) is proposed to adaptively capture long-range dependencies in an efficient, computation-friendly manner, enhancing feature representation and improving segmentation performance. Specifically, a novel pooling-aggregate position attention module and a pooling-aggregate channel attention module are designed to aggregate contexts in the spatial and channel dimensions, respectively. These two modules adopt pooling operations along the channel dimension and along the spatial dimension, respectively, to aggregate information and thus reduce computational costs. In these modules, attention maps are generated by four different paths and aggregated into one. The authors conduct extensive experiments to validate the effectiveness of DPANet and achieve new state-of-the-art segmentation performance on the well-known fish image dataset DeepFish as well as on the underwater image dataset SUIM, reaching Mean IoU scores of 91.08% and 85.39% respectively, while reducing the FLOPs of the attention modules by about 93%.
Database: OpenAIRE
External link:
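The core idea described in the abstract (replacing full pairwise attention with attention against pooled keys to cut FLOPs) can be illustrated with a minimal NumPy sketch. This is not the authors' exact module: the function name, the use of mean pooling along height and width, and the single-path aggregation here are illustrative assumptions; DPANet combines four attention paths and also has a channel-attention counterpart.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pooled_position_attention(x):
    """Hypothetical sketch of pooling-aggregated spatial attention.

    x: feature map of shape (C, H, W).
    Keys/values are mean-pooled along width and along height, so each
    pixel attends to H + W aggregated positions instead of H * W,
    which is where the FLOPs saving comes from.
    """
    C, H, W = x.shape
    q = x.reshape(C, H * W).T                 # (HW, C): one query per pixel
    k_h = x.mean(axis=2).T                    # (H, C): pooled along width
    k_w = x.mean(axis=1).T                    # (W, C): pooled along height
    k = np.concatenate([k_h, k_w], axis=0)    # (H+W, C): aggregated keys
    attn = softmax(q @ k.T / np.sqrt(C), axis=1)  # (HW, H+W) attention map
    out = (attn @ k).T.reshape(C, H, W)       # weighted sum of pooled values
    return out

feat = np.random.rand(8, 16, 16).astype(np.float32)
out = pooled_position_attention(feat)
print(out.shape)  # (8, 16, 16)
```

For an H x W map, full self-attention scores scale with (HW)^2 pairs, while this pooled variant scales with HW * (H + W); at 16 x 16 that is 256 x 32 instead of 256 x 256 scores, consistent in spirit with the roughly 93% attention-FLOPs reduction reported for DPANet.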