Author:
Bohua Chen, Hanzhi Ma, Junjie He, Yinzhang Ding, Lianghao Wang, Dongxiao Li, Ming Zhang
Language:
English
Year of publication:
2019
Subject:

Source:
IEEE Access, Vol 7, Pp 144756-144765 (2019)
Document type:
article
ISSN:
2169-3536
DOI:
10.1109/ACCESS.2019.2944925
Description:
Recent work has shown that self-attention modules improve the performance of convolutional neural networks (CNNs); in these modules, global operations are conventionally used to generate descriptors from the feature context for attention calculation and characteristic recalibration. However, the performance gain is compromised because the same descriptor is shared across different feature contexts. In this paper, we propose the Pyramid Attention Mechanism (PAM), which incorporates contextual reasoning into the self-attention module to enhance the discriminative ability of descriptors. PAM is lightweight yet efficient and can be integrated with most self-attention modules. It consists of two operators, aggregation and distribution, which assemble and synthesize contextual information at different levels. Extensive experiments on several benchmarks (CIFAR-100, ImageNet-1K, MS COCO, and VOC 2007) indicate that PAM produces competitive performance gains. In classification tasks, plugging PAM into self-attention modules yields accuracy improvements of up to 2.18% across various network structures.
Database:
Directory of Open Access Journals
External link:

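To make the abstract's description concrete, the following is a minimal, illustrative sketch of a pyramid-style channel attention block, based only on the abstract above: descriptors are aggregated from the feature context at several spatial levels and then distributed back as per-channel recalibration weights. The pooling sizes, the fusion layer, and all names (e.g. PyramidChannelAttention, levels, reduction) are assumptions made for illustration and are not taken from the PAM paper itself.

```python
# Hypothetical sketch inspired by the abstract; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidChannelAttention(nn.Module):
    """SE-style channel attention whose descriptor is built from several
    spatial context levels (a pyramid) rather than a single global pooling."""

    def __init__(self, channels: int, levels=(1, 2, 4), reduction: int = 16):
        super().__init__()
        self.levels = levels
        # Distribution: fuse the per-level descriptors into channel weights.
        self.fc = nn.Sequential(
            nn.Linear(channels * len(levels), channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        descriptors = []
        for level in self.levels:
            # Aggregation: pool the feature map to a level x level grid,
            # then average the grid cells into one descriptor per level.
            pooled = F.adaptive_avg_pool2d(x, level)           # (n, c, level, level)
            descriptors.append(pooled.flatten(2).mean(dim=2))  # (n, c)
        # Recalibrate the input features with the fused attention weights.
        weights = self.fc(torch.cat(descriptors, dim=1)).view(n, c, 1, 1)
        return x * weights


if __name__ == "__main__":
    block = PyramidChannelAttention(channels=64)
    features = torch.randn(2, 64, 32, 32)
    print(block(features).shape)  # torch.Size([2, 64, 32, 32])
```

In this sketch the multi-level descriptors replace the single globally pooled descriptor that the abstract identifies as the limitation of conventional self-attention modules; how the actual PAM operators combine the levels may differ from this assumed concatenation-plus-MLP fusion.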