Edge and neighborhood guidance network for 2D medical image segmentation

Authors: Xiaodong Yang, Weiwei Cao, Jian Zheng, Dehui Xiang, Yakang Dai, Saisai Ding, Zhaobang Liu, Haotian Sun
Year: 2021
Source: Biomedical Signal Processing and Control. 69:102856
ISSN: 1746-8094
Abstract: Accurate automatic image segmentation is important in medical image analysis. A perfect segmentation with a fully convolutional network (FCN) requires an accurate classification of every pixel, yet accurately differentiating edge pixels from neighborhood pixels in weak edge regions remains a great challenge. Many previous segmentation methods have focused on edge information to mitigate weak-edge problems, but the more important neighborhood information is undervalued. To tackle this problem, we propose a novel yet effective Edge and Neighborhood Guidance Network (ENGNet). Specifically, instead of using edge information only as a shape constraint, the edge and neighborhood guidance (ENG) module is designed to exploit edge information and fine-grained neighborhood spatial information simultaneously, improving the network's ability to classify edge and neighborhood pixels in weak edge regions. Moreover, ENG modules are applied at different scales to learn sufficient feature representations of edges and neighborhoods. To extract complementary features more effectively in the channel dimension, we also design a channel-wise multi-scale adaptive selection (MAS) module that extracts multi-scale context information and adaptively fuses features from different scales. Two public 2D segmentation datasets, a skin lesion dataset and an endoscopic polyp dataset, are used to evaluate the proposed ENGNet. Experimental results demonstrate that by simultaneously exploiting edge and neighborhood spatial information at different scales, ENGNet effectively alleviates misclassification in weak edge regions and outperforms other state-of-the-art methods.
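The abstract describes the MAS module only at a high level: per-channel, adaptive fusion of features computed at different scales. The paper's actual module is a learned convolutional block; the sketch below is a simplified, hypothetical NumPy illustration of the general idea (selective, channel-wise softmax weighting over scale branches, in the spirit of selective-kernel fusion), not the authors' implementation. The function name `multi_scale_adaptive_fuse` and the use of global average pooling as the channel descriptor are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_scale_adaptive_fuse(features):
    """Hypothetical sketch: fuse a list of same-shape feature maps
    (C, H, W), each from a different scale branch, by channel-wise
    soft attention. In the paper this selection is learned; here the
    channel descriptor is a simple global average pool."""
    stacked = np.stack(features)                   # (S, C, H, W)
    desc = stacked.mean(axis=(2, 3))               # (S, C) channel descriptors
    weights = softmax(desc, axis=0)                # (S, C), sums to 1 over scales
    # Broadcast per-channel weights over the spatial dims and sum scales.
    return (weights[:, :, None, None] * stacked).sum(axis=0)  # (C, H, W)
```

Because the weights sum to one across scale branches for each channel, fusing identical inputs returns them unchanged, and otherwise each channel is a convex combination of its scale-specific responses.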
Database: OpenAIRE