Author: Zeqian Yi, Xinghui Zhu, Runbing Wu, Zhuoyang Zou, Yi Liu, Lei Zhu
Language: English
Year of publication: 2023
Subject:
Source: Applied Sciences, Vol 14, Iss 1, p 93 (2023)
Document type: article
ISSN: 2076-3417
DOI: 10.3390/app14010093
Description: Due to its low storage cost and high computational efficiency, cross-modal hashing has attracted widespread attention in recent years. In this paper, we investigate how supervised cross-modal hashing (CMH) benefits from multi-label and contrastive learning (CL) by addressing two challenges: (i) how to combine multi-label and supervised contrastive learning to capture the diverse relationships among cross-modal instances, and (ii) how to reduce the sparsity of multi-label representations so as to improve the accuracy of similarity measurement. To this end, we propose a novel cross-modal hashing framework, dubbed Multi-Label Weighted Contrastive Hashing (MLWCH). The framework introduces compact consistent similarity representation, a newly designed multi-label similarity calculation method that reduces the sparsity of multi-label vectors by removing redundant zero elements. Furthermore, a novel multi-label weighted contrastive learning strategy is developed that significantly improves hash learning by assigning similarity weights to positive samples under both linear and non-linear similarity measures. Extensive experiments and ablation analyses on three benchmark datasets validate the superiority of our MLWCH method over several strong baselines.
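The weighted contrastive idea sketched in the abstract can be illustrated with a minimal example, assuming a PyTorch setup in which the cosine similarity of multi-hot label vectors weights each positive cross-modal pair; the function name mlwch_loss, the temperature value, and the toy tensor shapes are illustrative assumptions, not the paper's released implementation.

```python
# Minimal sketch (not the authors' code) of a multi-label weighted
# contrastive loss for cross-modal hashing.
import torch
import torch.nn.functional as F

def mlwch_loss(img_codes, txt_codes, labels, temperature=0.3):
    """Weighted cross-modal contrastive loss.

    img_codes, txt_codes: (N, K) real-valued relaxations of K-bit hash codes.
    labels:               (N, C) multi-hot label matrix.
    Positive pairs (label overlap > 0) are weighted by the cosine similarity
    of their label vectors, so strongly related instances pull harder.
    """
    img = F.normalize(img_codes, dim=1)
    txt = F.normalize(txt_codes, dim=1)
    logits = img @ txt.t() / temperature            # (N, N) cross-modal similarities

    lab = F.normalize(labels.float(), dim=1)
    sim_w = lab @ lab.t()                           # label cosine similarity in [0, 1]
    pos_mask = (sim_w > 0).float()

    # log-softmax over each row, then average log-likelihood of positives,
    # weighted by their label similarity
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    weighted = (sim_w * pos_mask * log_prob).sum(1) / pos_mask.sum(1).clamp(min=1)
    return -weighted.mean()

# Toy usage: 4 instances, 8-bit codes, 5 labels.
torch.manual_seed(0)
loss = mlwch_loss(torch.randn(4, 8), torch.randn(4, 8),
                  torch.randint(0, 2, (4, 5)))
print(loss.item())
```

In this sketch the label-similarity weight plays the role of the linear similarity described in the abstract; a non-linear variant could pass the same weights through a kernel or exponential mapping before applying them.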
Database: Directory of Open Access Journals
External link: