Dense Attentive Feature Enhancement for Salient Object Detection
| Author | Jiashi Feng, Jian Zhao, Congyan Lang, Qibin Hou, Zun Li, Songhe Feng, Liqian Liang |
| --- | --- |
| Year of publication | 2022 |
| Subject | |
| Source | IEEE Transactions on Circuits and Systems for Video Technology. 32:8128-8141 |
| ISSN | 1558-2205, 1051-8215 |
| DOI | 10.1109/tcsvt.2021.3102944 |
| Description | Attention mechanisms have been proven highly effective for salient object detection. Most previous works use attention as a self-gated module to reweight the feature maps at each level independently. However, such attention is limited to guidance at a single level and cannot satisfy the need to both accurately detect intact objects and maintain their detailed boundaries. In this paper, we build dense attention upon features from multiple levels simultaneously and propose a novel Dense Attentive Feature Enhancement (DAFE) module for efficient feature enhancement in saliency detection. DAFE stacks several attentional units and densely connects the attentive feature output of each unit to all of its subsequent units. This allows feature maps at deep units to absorb attentive information from shallow units, so more discriminative information can be efficiently selected at the final output. Note that DAFE is plug-and-play and can be effortlessly inserted into any saliency or video saliency model to improve its performance. We further instantiate a highly effective Dense Attentive Feature Enhancement Network (DAFE-Net) for accurate salient object detection. DAFE-Net constructs DAFE over an aggregated feature that contains both semantics and saliency details, so entire salient objects and their boundaries are well retained through dense attention. Extensive experiments demonstrate that the proposed DAFE module is highly effective and that DAFE-Net performs favorably against state-of-the-art approaches. |
| Database | OpenAIRE |
| External link | |
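
The description above characterizes DAFE as a stack of attentional units whose attentive outputs are densely connected to all subsequent units. Below is a minimal, hypothetical PyTorch sketch of that connectivity pattern only; the channel-attention form of each unit, the number of units, and the fusion by concatenation plus a 1x1 convolution are illustrative assumptions, not the authors' exact DAFE design.

```python
# Sketch of a densely connected stack of attention units (assumed structure,
# for illustration only; not the published DAFE implementation).
import torch
import torch.nn as nn


class AttentionUnit(nn.Module):
    """A simple self-gated channel-attention unit (assumed form)."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.reduce = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),               # global context per channel
            nn.Conv2d(out_channels, out_channels, kernel_size=1),
            nn.Sigmoid(),                          # channel-wise attention weights
        )

    def forward(self, x):
        x = self.reduce(x)
        return x * self.gate(x)                    # reweight the feature map


class DenseAttentiveEnhancement(nn.Module):
    """Stack of attention units with dense connections: each unit receives the
    concatenation of the input feature and all previous attentive outputs."""

    def __init__(self, channels=64, num_units=3):
        super().__init__()
        self.units = nn.ModuleList(
            [AttentionUnit(channels * (i + 1), channels) for i in range(num_units)]
        )
        self.fuse = nn.Conv2d(channels * (num_units + 1), channels, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for unit in self.units:
            # Dense connectivity: every unit sees all earlier attentive outputs.
            feats.append(unit(torch.cat(feats, dim=1)))
        return self.fuse(torch.cat(feats, dim=1))  # final enhanced feature


if __name__ == "__main__":
    f = torch.randn(1, 64, 32, 32)                 # a placeholder aggregated feature
    out = DenseAttentiveEnhancement(channels=64, num_units=3)(f)
    print(out.shape)                               # torch.Size([1, 64, 32, 32])
```

Concatenation is used here purely to make the dense connections explicit; the actual module may fuse attentive features from preceding units differently.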