Constructing a mobile visual search framework for Dunhuang murals based on fine-tuned CNN and ontology semantic distance

Authors: Ziming Zeng, Shouqiang Sun, Jingjing Sun, Jie Yin, Yueyan Shen
Year of publication: 2022
Subject:
Source: The Electronic Library, 40:121-139
ISSN: 0264-0473
DOI: 10.1108/el-09-2021-0173
Description:
Purpose – Dunhuang murals are rich in cultural and artistic value. The purpose of this paper is to construct a novel mobile visual search (MVS) framework for Dunhuang murals, enabling users to efficiently search for similar, relevant and diversified images.
Design/methodology/approach – A convolutional neural network (CNN) model is fine-tuned on a data set of Dunhuang murals. Image features are extracted with the fine-tuned CNN model, and the similarities between candidate images and the query image are calculated by the dot product. The candidate images are then sorted by similarity, and semantic labels are extracted from the most similar image. Ontology semantic distance (OSD) is proposed to match relevant images using these semantic labels. Furthermore, an improved DivScore is introduced to diversify the search results.
Findings – The results show that the fine-tuned ResNet152 is the best choice for searching similar images at the visual feature level, and that OSD is an effective method for searching relevant images at the semantic level. After re-ranking based on DivScore, the diversity of the search results is improved.
Originality/value – This study collects and builds a Dunhuang mural data set and proposes an effective MVS framework for Dunhuang murals to protect and pass on Dunhuang cultural heritage. Similar, relevant and diversified Dunhuang murals can be searched to meet different demands.
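For readers who want to prototype the visual-feature stage described above, the following is a minimal sketch in PyTorch, not the authors' implementation: it extracts features with a ResNet152 backbone and ranks candidates by dot-product similarity. The pretrained ImageNet weights, the preprocessing pipeline and the helper names are illustrative assumptions; the paper uses a ResNet152 fine-tuned on the Dunhuang mural data set.

```python
# Sketch only: feature extraction with a ResNet152 backbone and dot-product ranking.
# The weights, preprocessing and helper functions are assumptions for illustration.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load ResNet152 (ImageNet weights here; the paper fine-tunes on Dunhuang murals)
# and drop the classification head to obtain a 2048-d feature extractor.
backbone = models.resnet152(weights=models.ResNet152_Weights.DEFAULT)
feature_extractor = torch.nn.Sequential(*list(backbone.children())[:-1])
feature_extractor.eval()

preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def extract_feature(path: str) -> torch.Tensor:
    """Return a 2048-d feature vector for one image file."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        feat = feature_extractor(img).flatten(1)  # shape: (1, 2048)
    return feat.squeeze(0)

def rank_by_dot_product(query_path: str, candidate_paths: list[str]) -> list[tuple[str, float]]:
    """Sort candidate images by dot-product similarity to the query image."""
    q = extract_feature(query_path)
    scores = [(p, float(torch.dot(q, extract_feature(p)))) for p in candidate_paths]
    return sorted(scores, key=lambda x: x[1], reverse=True)
```

The top-ranked image from such a visual search would then supply the semantic labels used in the subsequent OSD matching and DivScore re-ranking steps, whose exact formulations are given in the paper itself.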
Database: OpenAIRE