Transformer fusion-based scale-aware attention network for multispectral victim detection

Author: Yunfan Chen, Yuting Li, Wenqi Zheng, Xiangkui Wan
Language: English
Year of publication: 2024
Source: Complex & Intelligent Systems, Vol 10, Iss 5, Pp 6619-6632 (2024)
Document type: article
ISSN: 2199-4536, 2198-6053
DOI: 10.1007/s40747-024-01515-y
Description: Abstract The aftermath of a natural disaster leaves victims trapped in rubble who are difficult for smart drones to detect, owing to low visibility in adverse disaster environments and the wide variation in victim sizes. To overcome these challenges, a transformer fusion-based scale-aware attention network (TFSANet) is proposed that mitigates adverse environmental effects in disaster areas by robustly integrating the latent interactions between RGB and thermal images, while also addressing the detection of victims of various sizes. First, a transformer fusion model is developed that incorporates a two-stream backbone network to effectively fuse the complementary characteristics of RGB and thermal images. This addresses the problem that victims cannot be seen clearly under adverse conditions in disaster areas, such as smog and heavy rain. In addition, a scale-aware attention mechanism embedded in the head network adaptively adjusts the size of the receptive fields to capture victims at different scales. Extensive experiments on two challenging datasets show that TFSANet achieves superior results. The proposed method achieves 86.56% average precision (AP) on the National Institute of Informatics—Chiba University (NII-CU) multispectral aerial person detection dataset, outperforming the state-of-the-art approach by 4.38%. On the drone-captured RGBT person detection (RGBTDronePerson) dataset, the proposed method improves the AP of the state-of-the-art approach by 4.33%.
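The abstract's scale-aware attention idea, weighting feature branches that correspond to receptive fields of different sizes so that victims at different scales are captured, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the branch features, the pooling scheme, and the projection matrix `w` are all hypothetical stand-ins for whatever TFSANet actually learns.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scale_aware_fusion(branches, w):
    """Fuse multi-scale feature branches with per-branch attention weights.

    branches: list of K arrays, each (C, H, W) -- hypothetical features
              produced with receptive fields of different sizes.
    w:        (K, C) hypothetical learned projection mapping a globally
              pooled descriptor to one attention logit per branch.
    """
    stacked = np.stack(branches)            # (K, C, H, W)
    pooled = stacked.sum(0).mean((1, 2))    # (C,) global channel descriptor
    logits = w @ pooled                     # (K,) one logit per scale branch
    attn = softmax(logits)                  # (K,) weights summing to 1
    # Weighted sum of branches: larger weight -> that receptive field dominates.
    return np.tensordot(attn, stacked, axes=1)  # (C, H, W)

# Toy usage: three scale branches over a 4-channel 8x8 feature map.
rng = np.random.default_rng(0)
branches = [rng.standard_normal((4, 8, 8)) for _ in range(3)]
w = rng.standard_normal((3, 4))
fused = scale_aware_fusion(branches, w)
```

The design mirrors selective-kernel-style attention: a global descriptor of the combined features decides how much each receptive-field branch contributes, so the effective receptive field adapts per input.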
Database: Directory of Open Access Journals