Saliency detection via background and foreground null space learning
Author: | Shuo Zhang, Xin-Gang Zhang, Ping Zhang, Ying Ying Zhang |
Year of publication: | 2019 |
Subject: | Computer science, Computer Vision and Pattern Recognition, Artificial intelligence, Pattern recognition, Null (mathematics), Image (mathematics), Transformation (function), Distance measurement, Salient, Benchmark (computing), Signal Processing, Electrical and Electronic Engineering, Software |
Source: | Signal Processing: Image Communication. 70:271-281 |
ISSN: | 0923-5965 |
Description: | In this paper, we present a novel bottom-up salient object detection approach that exploits the relationship between saliency detection and null space learning. A key observation is that the saliency of an image segment can be estimated by measuring its distance to a single point that represents the background or the foreground salient samples in the corresponding null space. We apply the null Foley–Sammon transform to model the null spaces of the background samples and the foreground salient samples, in which the potentially large and complex intra-class variations of the samples are removed entirely and each class is represented by a single point. The separation of salient regions from the background is then formulated as a distance measurement to this single point in the null space. An optimization algorithm is devised to fuse the background-sample-based and foreground-sample-based saliency maps. Results on five benchmark datasets show that the proposed method outperforms recent state-of-the-art methods on several evaluation metrics, especially for complex natural images. |
Database: | OpenAIRE |
External link: |
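The description above outlines the core mechanism: the null Foley–Sammon transform (NFST) projects all training samples of a class onto a single point, and saliency is scored as the distance of a segment's projection to that point. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the function names (`nfst`, `background_saliency`), the random segment descriptors, and the two-class setup are illustrative assumptions, and the paper's feature extraction, foreground-based map, and fusion optimization are omitted.

```python
# Minimal sketch (not the authors' code) of the null Foley-Sammon transform (NFST)
# idea from the description: in the learned null projection directions every
# training sample of a class collapses to one point, so saliency can be scored as
# the distance of a segment's projection to the "background" point.
import numpy as np


def nfst(X, labels, tol=1e-10):
    """Return null projection directions W (features x directions) and the
    single point each class collapses to after projection."""
    classes = np.unique(labels)

    # Centered data for the total scatter and per-class centered data for
    # the within-class scatter.
    Xt = X - X.mean(axis=0)
    Xw = np.vstack([X[labels == c] - X[labels == c].mean(axis=0) for c in classes])

    # Orthonormal basis of the range of the total scatter.
    U, s, _ = np.linalg.svd(Xt.T, full_matrices=False)
    U = U[:, s > tol * s.max()]

    # Within-class scatter restricted to that subspace, and its null space.
    Sw_proj = (Xw @ U).T @ (Xw @ U)
    evals, evecs = np.linalg.eigh(Sw_proj)
    Q = evecs[:, evals < tol * max(evals.max(), 1.0)]

    W = U @ Q  # directions with zero within-class scatter and positive total scatter
    points = {c: (X[labels == c] @ W).mean(axis=0) for c in classes}
    return W, points


def background_saliency(features, W, bg_point):
    """Background-based map: larger distance to the background point -> more salient."""
    return np.linalg.norm(features @ W - bg_point, axis=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical segment descriptors; more dimensions than samples so that the
    # within-class scatter has a non-trivial null space.
    bg = rng.normal(0.0, 1.0, size=(40, 200))   # "background" segments
    fg = rng.normal(2.0, 1.0, size=(40, 200))   # "foreground salient" segments
    X = np.vstack([bg, fg])
    y = np.array([0] * 40 + [1] * 40)

    W, points = nfst(X, y)
    test = rng.normal(2.0, 1.0, size=(5, 200))  # unseen salient-like segments
    print(background_saliency(test, W, points[0]))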