Entropy information‐based heterogeneous deep selective fused features using deep convolutional neural network for sketch recognition

Author: Shaukat Hayat, She Kun, Sara Shahzad, Parinya Suwansrikham, Muhammad Mateen, Yao Yu
Language: English
Year of publication: 2021
Source: IET Computer Vision, Vol 15, Iss 3, Pp 165-180 (2021)
Document type: article
ISSN: 1751-9640; 1751-9632
DOI: 10.1049/cvi2.12019
Description: An effective feature representation can boost recognition tasks in the sketch domain. Because a sketch has a more abstract and diverse structure than a natural image, generating a discriminative feature representation for sketch recognition is difficult. Accordingly, this article presents a novel scheme for sketch recognition that builds a discriminative feature representation by integrating essential, asymmetric information from deep features; this information is retained as the final feature‐vector space on which the decision is made. Specifically, five well‐known pre‐trained deep convolutional neural networks (DCNNs), namely AlexNet, VGGNet‐19, Inception V3, Xception, and InceptionResNetV2, are fine‐tuned and used for feature extraction. First, the high‐level deep layers of the networks are used to obtain a multi‐feature hierarchy from sketch images. Second, an entropy‐based neighbourhood component analysis is employed to rank and fuse the features drawn from multiple layers of the various deep networks. Finally, the ranked feature‐vector space is fed into a support vector machine (SVM) classifier to produce the sketch classification outcome. The performance of the proposed scheme is evaluated on two sketch datasets, TU‐Berlin and Sketchy, for classification and retrieval tasks. Experimental results demonstrate that the proposed scheme brings a substantial improvement over human recognition accuracy and other state‐of‐the‐art algorithms.
Database: Directory of Open Access Journals
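The pipeline summarised in the abstract — fusing deep features from several networks, ranking them by an entropy-guided criterion, and classifying with an SVM — can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: random arrays stand in for features extracted from the DCNNs' high-level layers, and scikit-learn's `mutual_info_classif` (an entropy-based relevance score) stands in for the paper's entropy-based neighbourhood component analysis.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Hypothetical stand-in data: random vectors play the role of deep features
# taken from the high-level layers of the five pre-trained DCNNs
# (AlexNet, VGGNet-19, Inception V3, Xception, InceptionResNetV2).
rng = np.random.default_rng(0)
n_samples, n_classes = 300, 5
y = rng.integers(0, n_classes, n_samples)

# Simulate one 64-dim feature block per network and fuse by concatenation.
feats_per_net = [rng.normal(size=(n_samples, 64)) for _ in range(5)]
for f in feats_per_net:
    f[:, :4] += y[:, None]        # make a few dimensions class-informative
fused = np.hstack(feats_per_net)  # fused feature space, shape (300, 320)

# Entropy-based ranking stand-in: score each feature by its mutual
# information with the class label, then keep the top-ranked features.
scores = mutual_info_classif(fused, y, random_state=0)
top_k = np.argsort(scores)[::-1][:50]
selected = fused[:, top_k]        # ranked feature-vector space, (300, 50)

# Final decision stage: an SVM classifier on the selected features.
X_tr, X_te, y_tr, y_te = train_test_split(selected, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In the actual scheme the feature blocks come from fine-tuned networks and the ranking uses neighbourhood component analysis rather than mutual information; the stand-ins here only mirror the overall extract → rank → classify structure.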