VP-NIQE: An opinion-unaware visual perception natural image quality evaluator
Author: Hua Chen, Leyuan Wu, Xiaogang Zhang, Dingxiang Wang, Jingfang Deng
Year: 2021
Subject: Visual perception, Image quality, Image quality assessment, Pattern recognition, Artificial intelligence, Feature (computer vision), Histogram, Human visual system model, Benchmark (computing), Perspective (graphical)
Source: Neurocomputing, 463:17–28
ISSN: 0925-2312
Description: The opinion-unaware (OU) blind image quality assessment (IQA) method shows broad application prospects due to the explosive growth of unlabelled images. Under a local quality perception framework, many works have focused on extracting more complex natural scene statistics (NSS) features to boost performance. The framework itself, however, is seldom studied. In addition, NSS features are designed using prior knowledge of the human visual system (HVS), which is insufficiently powerful for broad types of distortions. Therefore, this paper proposes a visual perception natural image quality evaluator (VP-NIQE). Specifically, we propose an understanding-based global–local structure IQA model to simulate the top-down structure of the HVS in image perception. Furthermore, we enrich the feature types from a more comprehensive perspective in local distortion detection: in addition to the NSS features, histogram and deep-learned features are incorporated to detect distortions. NSS and histogram features represent the low-level prior knowledge of the HVS, while deep-learned features represent the high-level information behind the data. The proposed method is evaluated on six benchmark databases. The evaluation results demonstrate that the proposed method achieves state-of-the-art performance compared with existing OU methods and even generates competitive performance compared with classical full-reference methods.
Database: OpenAIRE
External link:
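The NSS features mentioned in the abstract are, in NIQE-style evaluators, typically computed from mean-subtracted contrast-normalized (MSCN) coefficients of the image. The sketch below illustrates that standard preprocessing step only; it is a minimal illustration of the common NSS pipeline, not the authors' VP-NIQE implementation, and the function name and parameter defaults are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn_coefficients(image, sigma=7/6, c=1.0):
    """Compute MSCN coefficients, the usual first step for NSS features
    in NIQE-style blind IQA models (illustrative sketch, not VP-NIQE itself).

    Each pixel is normalized by a local Gaussian-weighted mean and
    standard deviation; distortions perturb the statistics of the result.
    """
    image = image.astype(np.float64)
    # Local mean via Gaussian smoothing.
    mu = gaussian_filter(image, sigma)
    # Local standard deviation (abs guards against tiny negative values
    # from floating-point round-off).
    sd = np.sqrt(np.abs(gaussian_filter(image * image, sigma) - mu * mu))
    # Normalize; the constant c stabilizes flat regions.
    return (image - mu) / (sd + c)
```

In a full NSS pipeline, a generalized Gaussian distribution is then fitted to these coefficients (and to products of neighboring coefficients), and the fitted parameters serve as the quality-aware features.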