EC2Net: Efficient Attention-Based Cross-Context Network for Near Real-Time Salient Object Detection

Authors: Ngo Thien Thu, Md. Delowar Hossain, Eui-Nam Huh
Language: English
Year of publication: 2023
Subject:
Source: IEEE Access, Vol 11, pp. 39845-39854 (2023)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3268114
Description: The development of salient object detection is crucial in ubiquitous applications. Existing state-of-the-art models tend to have complex designs and a significant number of parameters, prioritizing performance improvement over efficiency. Hence, deploying them on edge devices poses significant challenges. The intricacy of these models stems from complicated encoder-decoder designs that aim to effectively generate and integrate coarse and semantic features. To address this problem, we introduce EC2Net, an efficient attention-based cross-context network for salient object detection. First, we introduce a shallow cross-context aggregation (SCCA) mechanism to enhance and preserve object boundaries in shallow layers. We then introduce a deep cross-context aggregation (DCCA) mechanism to enhance semantic features in deep layers. Finally, we introduce a dual cross-fusion module (DCFM) to efficiently merge shallow and deep features. The proposed modules complement each other, enabling EC2Net to accurately detect salient objects with reduced computational overhead. In experiments on five standard datasets, the proposed method demonstrated competitive performance while requiring fewer parameters, FLOPs, and less memory storage than other resource-intensive models.
Database: Directory of Open Access Journals