Dynahead-YOLO-Otsu: an efficient DCNN-based landslide semantic segmentation method using remote sensing images

Author: Zheng Han, Bangjie Fu, Zhenxiong Fang, Yange Li, Jiaying Li, Nan Jiang, Guangqi Chen
Language: English
Year of publication: 2024
Subject:
Source: Geomatics, Natural Hazards & Risk, Vol 15, Iss 1 (2024)
Document type: article
ISSN: 1947-5705, 1947-5713
DOI: 10.1080/19475705.2024.2398103
Description: Recent advancements in deep convolutional neural networks (DCNNs) have significantly improved landslide identification using remote sensing images. Pixel-wise semantic segmentation (PSS) and object-oriented detection (OOD) are the two dominant approaches; PSS is preferable because it provides detailed delineation of landslide shapes, but it is limited by the difficulty of labelling training data and by lower segmentation speed compared with OOD. In this paper, we propose an efficient DCNN-based landslide semantic segmentation method, termed Dynahead-YOLO-Otsu, which performs PSS based on OOD results. Potential landslide regions are first located by the OOD-based Dynahead-YOLO model, which enhances the capacity to detect landslides with variable proportions and complex backgrounds in the images. These preliminary detections are then processed with the Otsu binarization algorithm, which clusters landslide pixels within the potential regions to yield the semantic segmentation. To validate the performance, we tested the proposed method on an open-source dataset containing 950 landslide images and compared the results with three up-to-date DCNN- and PSS-based approaches, namely DeepLab v3+, PSPNet, and U-Net. Results demonstrate that the proposed method achieves comparable Recall (71.80%) and F1 score (75.80%), with average improvements of 22% and 16% in Precision and IoU, respectively.
Database: Directory of Open Access Journals
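
The detection-then-threshold pipeline summarized in the description can be illustrated with a minimal sketch: candidate boxes from an object detector are post-processed with Otsu binarization to obtain pixel-level masks. The bounding-box format, the use of OpenCV, and the grey-level polarity of landslide pixels are illustrative assumptions here, not the authors' released implementation; Dynahead-YOLO itself is stood in for by a generic list of boxes.

```python
# Minimal sketch (not the authors' code): Otsu binarization applied inside
# candidate boxes returned by an object detector. The (x1, y1, x2, y2) box
# format and the brightness polarity of landslide pixels are assumptions.
import cv2
import numpy as np

def segment_landslides(image_bgr: np.ndarray, boxes) -> np.ndarray:
    """Build a binary landslide mask from detector boxes via Otsu thresholding."""
    mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    for x1, y1, x2, y2 in boxes:
        roi = gray[y1:y2, x1:x2]
        if roi.size == 0:
            continue
        # Otsu's method picks the threshold that minimises intra-class
        # variance, clustering ROI pixels into two groups.
        _, roi_mask = cv2.threshold(roi, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        mask[y1:y2, x1:x2] = np.maximum(mask[y1:y2, x1:x2], roi_mask)
    return mask

# Usage with hypothetical detections (e.g. from a YOLO-style model):
# image = cv2.imread("scene.tif")
# boxes = [(120, 80, 360, 290)]          # placeholder box, not real output
# landslide_mask = segment_landslides(image, boxes)
```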