Color Image Invariant Feature Extraction by a Topological Property Motivated PCNN Model
Author: Jun Yue, Zhiwang Zhang, Guangjie Kou, Yunyan Ma
Language: English
Year of publication: 2019
Subject: Topological property; Topological perception theory; Pulse coupled neural network (PCNN); Invariant feature extraction; Feature extraction; Saliency map; Spectral residual approach; Entropy (information theory); Visual perception; Color image; Pattern recognition; Artificial neural network; Artificial intelligence; LCSH: Electrical engineering. Electronics. Nuclear engineering (TK1-9971)
Source: IEEE Access, Vol. 7, pp. 149649-149656 (2019)
ISSN: 2169-3536
Abstract: Topological invariant features take priority over other visual features in the early stage of visual perception; this is the core idea of topological perception theory. To improve the robustness and distinguishability of the invariant features extracted by a pulse coupled neural network (PCNN), topological properties are integrated into the PCNN. The improved model, called the topological property motivated PCNN (TPCNN), adopts the saliency map computed by the spectral residual approach as a carrier of important topological properties (connectivity and the number of holes in the target). In TPCNN, the normalized saliency map is first used as a linking coefficient to emphasize the salient object when the invariant features are calculated. Second, the entropy signature of the saliency map is treated as an additional feature and merged with the original features calculated by the PCNN, yielding the final invariant feature. The proposed TPCNN is used to compute invariant features of different kinds of fish. Experimental results show that TPCNN outperforms state-of-the-art models on invariant feature extraction.
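The abstract's first two steps can be illustrated concretely. Below is a minimal sketch, assuming a grayscale NumPy image, of the spectral residual saliency map (Hou and Zhang's approach, which the paper adopts) normalized to [0, 1] so it could serve as a PCNN linking coefficient, plus a histogram-based entropy signature of that map. Function names and parameter choices (filter sizes, bin count) are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(gray):
    """Saliency map via the spectral residual approach.

    The residual is the log-amplitude spectrum minus its local
    average; recombining it with the original phase and inverting
    the FFT highlights salient (statistically unusual) regions.
    """
    f = np.fft.fft2(gray)
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    residual = log_amp - uniform_filter(log_amp, size=3)
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    sal = gaussian_filter(sal, sigma=2.5)
    # Normalize to [0, 1] so the map can act as a linking coefficient.
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)

def entropy_signature(sal, bins=256):
    """Shannon entropy (bits) of the saliency map's intensity histogram."""
    hist, _ = np.histogram(sal, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Toy usage on a random image.
img = np.random.rand(64, 64)
s = spectral_residual_saliency(img)
h = entropy_signature(s)
```

In TPCNN the scalar `h` would be appended to the PCNN's feature vector, while `s` weights the linking input per pixel; this sketch only covers the saliency and entropy computations, not the PCNN itself.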
Database: OpenAIRE