Author:
Bingsheng Li, Na Li, Jianmin Ren, Xupeng Guo, Chao Liu, Hao Wang, Qingwu Li
Language:
English
Year of publication:
2024
Subject:
Source:
Electronic Research Archive, Vol 32, Iss 7, Pp 4218-4236 (2024)
Document type:
article
ISSN:
2688-1594
DOI:
10.3934/era.2024190
Description:
Although the data fusion of hyperspectral images (HSI) and light detection and ranging (LiDAR) has provided significant gains for land-cover classification, it also brings technical obstacles (i.e., it is difficult to capture discriminative local and global spatial-spectral features from redundant data and to build interactions between heterogeneous data). In this paper, a classification network named enhanced spectral attention and adaptive spatial learning guided network (ESASNet) is proposed for the joint use of HSI and LiDAR. Specifically, first, by combining a convolutional neural network (CNN) with a transformer, adaptive spatial learning (ASL) and enhanced spectral learning (ESL) are proposed to learn the spectral-spatial features from the HSI data and the elevation features from the LiDAR data over both local and global receptive fields. Second, considering that HSI consists of continuous, narrow spectral bands, ESL is designed with an enhanced local self-attention that strengthens the mining of spectral correlations across adjacent bands. Finally, a feature fusion module is proposed to ensure efficient information exchange between HSI and LiDAR during spectral and spatial feature fusion. Experimental evaluations on the HSI-LiDAR dataset clearly show that ESASNet extracts features more effectively than state-of-the-art methods. The code is available at https://github.com/AirsterMode/ESASNet.
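The abstract describes an "enhanced local self-attention" that restricts spectral attention to adjacent bands. The following PyTorch sketch is only a rough illustration of that idea, not the authors' implementation: the class name LocalSpectralAttention, the window size, the head count, and the per-band token layout are all assumptions; the actual ESASNet code is in the linked repository.

import torch
import torch.nn as nn

class LocalSpectralAttention(nn.Module):
    """Hypothetical sketch: self-attention masked to a window of adjacent
    spectral bands, so each narrow-band channel only attends to its
    neighbours (the local spectral-correlation idea from the abstract)."""

    def __init__(self, dim, window=5, heads=4):
        super().__init__()
        self.heads = heads
        self.scale = (dim // heads) ** -0.5
        self.window = window
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        # x: (batch, bands, dim) -- one token per spectral band
        b, n, d = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.heads, d // self.heads)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)            # each: (b, heads, n, d/heads)
        attn = (q @ k.transpose(-2, -1)) * self.scale   # (b, heads, n, n)

        # Band-local mask: band i may only attend to bands within +/- window.
        idx = torch.arange(n, device=x.device)
        mask = (idx[None, :] - idx[:, None]).abs() > self.window
        attn = attn.masked_fill(mask, float("-inf"))

        out = attn.softmax(dim=-1) @ v                  # (b, heads, n, d/heads)
        out = out.transpose(1, 2).reshape(b, n, d)
        return self.proj(out)

if __name__ == "__main__":
    tokens = torch.randn(2, 64, 32)                     # 2 samples, 64 bands, 32-dim tokens
    print(LocalSpectralAttention(dim=32)(tokens).shape) # torch.Size([2, 64, 32])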
Database:
Directory of Open Access Journals
External link: