Parameter Adaptive Contrastive Hashing for multimedia retrieval.
Author: | Chen Y; Big Data Institute, School of Computer Science and Engineering, Central South University, Changsha, Hunan, 410000, China. Electronic address: yunfeichen@csu.edu.cn., Long Y; Data Science Institute, Vanderbilt University, Nashville, TN, 37235, United States of America. Electronic address: yitian.long@vanderbilt.edu., Yang Z; Big Data Institute, School of Computer Science and Engineering, Central South University, Changsha, Hunan, 410000, China. Electronic address: zyang22@csu.edu.cn., Long J; Big Data Institute, School of Computer Science and Engineering, Central South University, Changsha, Hunan, 410000, China. Electronic address: junlong@csu.edu.cn. |
---|---|
Language: | English |
Source: | Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2025 Feb; Vol. 182, pp. 106923. Date of Electronic Publication: 2024 Nov 20. |
DOI: | 10.1016/j.neunet.2024.106923 |
Abstract: | With the emergence of massive amounts of multi-source heterogeneous data on the Internet, quickly retrieving relevant information from such extensive data has become a hot research topic. Owing to their efficiency and speed, hash learning methods have become a mainstream approach to multimedia retrieval. However, unsupervised multimedia hash learning methods remain difficult to tune because of their many hyperparameters, and they lack precise guidance on semantic similarity. To address these problems, we propose a Parameter Adaptive Contrastive Hashing (PACH) method for multimedia retrieval. The Fast Parameter Adaptive (FPA) module leverages the powerful space-exploration and dynamic-optimization capabilities of reinforcement learning to provide a hot-pluggable parameter-adaptation component for multimedia hashing that searches for an approximately optimal combination of hyperparameters. The Multimedia Contrastive Hashing (MCH) module comprehensively explores the intra- and inter-modal semantic consistency of multimodal data, enriching the cross-modal semantic information of the hash codes. Comprehensive experiments comparing PACH against the latest hash learning methods verify its effectiveness and superiority. The code is available at https://github.com/YunfeiChenMY/PACH. Competing Interests: Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. (Copyright © 2024 Elsevier Ltd. All rights reserved.) |
Database: | MEDLINE |
External link: |
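The abstract's MCH module builds on cross-modal contrastive learning: hash codes of a matched image/text pair should be similar, while codes of unmatched pairs should not. The record does not give the authors' loss, so the snippet below is only a generic illustrative sketch of an InfoNCE-style cross-modal objective over tanh-relaxed hash codes; the function name, temperature value, and toy data are all assumptions, not PACH's actual formulation.

```python
import numpy as np

def cross_modal_contrastive_loss(img_codes, txt_codes, temperature=0.5):
    """Generic InfoNCE-style loss (illustrative, not PACH's exact loss):
    matched image/text codes in a batch are positives, all other
    cross-modal pairs are negatives."""
    # L2-normalize the continuous (tanh-relaxed) hash codes
    img = img_codes / np.linalg.norm(img_codes, axis=1, keepdims=True)
    txt = txt_codes / np.linalg.norm(txt_codes, axis=1, keepdims=True)
    logits = img @ txt.T / temperature  # cosine similarity matrix
    # Numerically stable softmax cross-entropy; the diagonal entries
    # are the matched (positive) pairs.
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy data: text codes derived from image codes (aligned) vs. random.
rng = np.random.default_rng(0)
features = rng.standard_normal((8, 16))
img_codes = np.tanh(features)  # relaxed hash codes in (-1, 1)
txt_codes = np.tanh(features + 0.05 * rng.standard_normal((8, 16)))
loss_aligned = cross_modal_contrastive_loss(img_codes, txt_codes)
loss_random = cross_modal_contrastive_loss(
    img_codes, np.tanh(rng.standard_normal((8, 16))))
```

Semantically aligned modalities yield a lower loss than randomly paired ones, which is the property contrastive hashing exploits; at inference time the relaxed codes would be binarized with `sign()`.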