Author:
Yang, Yueguang; Qu, Jiahui; Dong, Wenqian; Zhang, Tongzhen; Xiao, Song; Li, Yunsong
Source:
IEEE Transactions on Geoscience and Remote Sensing; 2024, Vol. 62, Issue 1, p. 1-15, 15 p.
Abstract:
The joint classification of hyperspectral images (HSIs) and LiDAR data plays a crucial role in Earth observation missions. Most advanced methods rely on discrete label supervision. However, discrete labels convey only the limited information that a sample belongs to a single definite class and lack prior information, so it is difficult for them to supervise a model in capturing the rich inherent semantic information of complex data distributions, which hinders classification performance. To this end, we propose a text-supervised multidimensional contrastive fusion network (TMCFN), which leverages class text information to guide the learning of visual representations while establishing a semantic association between text and visual features for classification through multidimensionally incorporated contrastive learning (CL) paradigms. Specifically, TMCFN is composed of text information encoding (TIE), visual feature representation (VFR), and text–visual feature alignment and classification (TVFAC). TIE extracts semantic information from class text expanded from class names, intrinsic attributes, and inter-class relationships. VFR mainly comprises a new fusion-based contrastive feature learning module (FCFLM) to extract discriminative visual features and a text-guided attention feature fusion module (TAF²M) to fuse visual features under the guidance of text information. TVFAC optimizes the learning of visual features under the supervision of text information while using a CL paradigm to align text and visual features and establish the semantic association; it achieves classification by directly computing the similarity between the visual features and each text feature, without an additional classifier. Experiments on three standard datasets verify the effectiveness of TMCFN.
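The classifier-free decision rule described at the end of the abstract, predicting the class whose text feature is most similar to a sample's visual feature, can be sketched as follows. This is a minimal illustration of cosine-similarity classification, not the paper's implementation; all shapes, names, and the synthetic data are assumptions.

```python
import numpy as np

def classify_by_similarity(visual_feats, text_feats):
    """Predict classes without an additional classifier (hypothetical sketch).

    visual_feats: (N, D) fused visual features, one row per sample.
    text_feats:   (C, D) encoded class text features, one row per class.
    Returns the index of the most similar class text feature per sample.
    """
    # L2-normalize so the dot product equals cosine similarity.
    v = visual_feats / np.linalg.norm(visual_feats, axis=1, keepdims=True)
    t = text_feats / np.linalg.norm(text_feats, axis=1, keepdims=True)
    sims = v @ t.T                 # (N, C) similarity matrix
    return sims.argmax(axis=1)     # predicted class index per sample

# Toy usage: visual features built as noisy copies of class text features.
rng = np.random.default_rng(0)
text_feats = rng.normal(size=(5, 16))    # 5 classes, 16-dim features (assumed)
visual_feats = text_feats[[2, 0, 4]] + 0.01 * rng.normal(size=(3, 16))
print(classify_by_similarity(visual_feats, text_feats))  # → [2 0 4]
```

With aligned text and visual embedding spaces, adding a class only requires encoding its text description, which is why no separate classification head is needed.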
Database:
Supplemental Index
External link:
|