Transformation-invariant Gabor convolutional networks
Author: | Feipeng Da, Lei Zhuang, Mengxiang Li, Shaoyan Gai |
Year of publication: | 2020 |
Subject: | Source code; Computer science; Pooling; Pattern recognition; Convolutional neural network; Robustness (computer science); Signal processing; Multimedia information systems; Artificial intelligence; Electrical and electronic engineering; Invariant (mathematics) |
Source: | Signal, Image and Video Processing. 14:1413-1420 |
ISSN: | 1863-1711 1863-1703 |
DOI: | 10.1007/s11760-020-01684-6 |
Description: | Although deep convolutional neural networks (DCNNs) have a powerful capability for learning complex feature representations, they handle large rotations and scale transformations poorly. In this paper, we propose a novel alternative to the conventional convolutional layer, named the Gabor convolutional layer (GCL), to enhance robustness to such transformations. The GCL is a simple but efficient combination of Gabor prior knowledge and parameter learning. A GCL is composed of three components: a Gabor extraction module, a weight-sharing convolution module, and a transformation pooling module. DCNNs integrated with GCLs, referred to as transformation-invariant Gabor convolutional networks (TI-GCNs), can be easily built by replacing standard convolutional layers with the designed GCLs. Our experimental results on various real-world recognition tasks indicate that encoding traditional hand-crafted Gabor filters with dominant orientation and scale information into DCNNs is of great importance for learning compact feature representations and reinforcing resistance to scale changes and orientation variations. The source code can be found at https://github.com/GuichenLv . |
Database: | OpenAIRE |
External link: |
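The abstract outlines a three-stage GCL pipeline: Gabor filters extract orientation-specific structure, a shared learned kernel is convolved with each orientation channel, and a pooling step collapses the orientation dimension to gain rotation robustness. The paper's exact formulation is in the linked repository; the sketch below is only a minimal, hypothetical illustration of that idea. The function names (`gabor_kernel`, `conv2d_valid`, `gcl_forward`), the choice of modulating the shared kernel by each Gabor filter, and the use of an element-wise max as the transformation pooling are assumptions, not the authors' implementation.

```python
import numpy as np

def gabor_kernel(size, theta, sigma=2.0, lam=4.0, gamma=0.5):
    # Real part of a Gabor filter at orientation theta.
    # sigma, lam (wavelength), gamma (aspect ratio) are illustrative defaults.
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / lam)

def conv2d_valid(img, kernel):
    # Naive 2-D 'valid' cross-correlation, for clarity rather than speed.
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def gcl_forward(img, learned_kernel, thetas):
    """Hypothetical GCL forward pass:
    1. Gabor extraction: modulate one shared learned kernel by a Gabor
       filter at each orientation (weight sharing across orientations).
    2. Convolve the input with each modulated kernel.
    3. Transformation pooling: element-wise max over orientations."""
    responses = []
    for theta in thetas:
        g = gabor_kernel(learned_kernel.shape[0], theta)
        responses.append(conv2d_valid(img, learned_kernel * g))
    return np.max(np.stack(responses), axis=0)

if __name__ == "__main__":
    img = np.arange(100, dtype=float).reshape(10, 10)
    learned = np.ones((5, 5))          # stand-in for a learned kernel
    thetas = [0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    out = gcl_forward(img, learned, thetas)
    print(out.shape)                    # one orientation-pooled feature map
```

Because the max is taken over orientation responses of a shared kernel, rotating the input approximately permutes the responses rather than changing the pooled value, which is the source of the rotation robustness the abstract describes.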