Author:
Wei Liu, Jiawei Xu, Zihui Guo, Erzhu Li, Xing Li, Lianpeng Zhang, Wensong Liu
Language:
English
Year of publication:
2021
Source:
IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Vol 14, Pp 2236-2248 (2021)
Document type:
article
ISSN:
2151-1535
DOI:
10.1109/JSTARS.2021.3052495
Description:
As the manual detection of building footprints is inefficient and labor-intensive, this study proposed a method for building footprint extraction and change detection based on deep convolutional neural networks. The study modified the existing U-Net model to develop the “PRU-Net” model. PRU-Net incorporates a pyramid scene parsing (PSP) module for multiscale scene parsing, a residual block (RB) from ResNet for feature extraction, and focal loss to address sample imbalance. Within the proposed method, building footprint extraction proceeds as follows: 1) unmanned aerial vehicle images are cropped, denoised, and semantically labeled, and datasets are created (including training/validation and prediction datasets); 2) the training/validation and prediction datasets are input into the fully convolutional neural network PRU-Net for model training/validation and prediction. Compared with the U-Net, PSP+U-Net (PU-Net), and U-Net++ models, PRU-Net offers improved footprint extraction for buildings of varying sizes and shapes. Large-scale experimental results demonstrated the effectiveness of the PSP module for multiscale scene analysis and of the RB module for feature extraction. After demonstrating the improvements in building extraction offered by PRU-Net, the building footprint results were further processed to generate a building change map.
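The focal loss mentioned in the abstract is a standard technique for handling sample imbalance, such as the scarcity of building pixels relative to background. Below is a minimal NumPy sketch of binary focal loss; the `alpha` and `gamma` values are common illustrative defaults, not parameters reported in the paper:

```python
import numpy as np

def focal_loss(p, y, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss.

    p: predicted probability of the positive class (e.g. "building pixel")
    y: ground-truth label (0 or 1)
    The (1 - p_t)**gamma factor down-weights easy, well-classified
    examples so that rare, hard examples dominate the gradient.
    """
    p = np.clip(p, eps, 1 - eps)           # avoid log(0)
    p_t = np.where(y == 1, p, 1 - p)       # probability of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return float(np.mean(-alpha_t * (1 - p_t) ** gamma * np.log(p_t)))
```

With `gamma = 0` this reduces to class-weighted cross-entropy; increasing `gamma` progressively suppresses the contribution of confidently correct pixels.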
Database:
Directory of Open Access Journals