Toward A No-reference Omnidirectional Image Quality Evaluation by Using Multi-perceptual Features
Author: Yun Liu, Xiaohua Yin, Zuliang Wan, Guanghui Yue, Zhi Zheng
Year of publication: 2023
Subject:
Source: ACM Transactions on Multimedia Computing, Communications, and Applications, 19:1-19
ISSN: 1551-6865, 1551-6857
DOI: 10.1145/3549544
Description: Compared to ordinary images, an omnidirectional image (OI) usually offers a wider field of view and a higher resolution, and image quality assessment (IQA) can help people understand and improve their visual experience. However, current IQA methods do not perform well on such images. To address this, we propose a novel visual perception-based no-reference/blind omnidirectional image quality assessment (NR/B-OIQA) model. Gradient-based global structural features and gray-level co-occurrence matrix-based local structural features are combined to capture rich quality-aware structural information, and a novel steganalysis rich model-based color descriptor is extracted to reflect the color information that most IQA models ignore. With multi-scale visual perception, we use image entropy and natural scene statistics features to convey high-level semantics and to quantify the unnaturalness of omnidirectional images. Finally, support vector regression is applied to predict the objective quality value from all extracted features, using the subjective scores for training. Experiments conducted on the OIQA and CVIQD2018 databases show that our model delivers more reliable and more competitive performance and conforms better to the subjective values. (An illustrative sketch of such a feature-plus-regression pipeline is given after this record.)
Database: OpenAIRE
External link:
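
For intuition, here is a minimal, hedged sketch of a feature-extraction-plus-SVR quality predictor in the spirit of the pipeline the abstract describes. It is not the authors' implementation: the specific features (gradient-magnitude statistics, a hand-rolled GLCM contrast/homogeneity pair, and histogram entropy), the SVR hyperparameters, and the toy data are all assumptions made for illustration, and the steganalysis-based color descriptor, multi-scale perception, and natural scene statistics features are omitted.

```python
# Illustrative sketch only: hand-crafted features + support vector regression,
# not the NR/B-OIQA model from the paper. All parameter choices are assumptions.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline


def gradient_features(img):
    """Mean and standard deviation of the gradient magnitude (a global structural cue)."""
    gy, gx = np.gradient(img.astype(np.float64))
    mag = np.hypot(gx, gy)
    return [mag.mean(), mag.std()]


def glcm_features(img, levels=16):
    """Contrast and homogeneity of a horizontal-offset gray-level co-occurrence matrix."""
    q = np.clip((img.astype(np.float64) / 256.0 * levels).astype(int), 0, levels - 1)
    glcm = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1.0)  # count neighbor pairs
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    contrast = float(np.sum(glcm * (i - j) ** 2))
    homogeneity = float(np.sum(glcm / (1.0 + np.abs(i - j))))
    return [contrast, homogeneity]


def entropy_feature(img):
    """Shannon entropy of the gray-level histogram (a coarse scene-complexity cue)."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256), density=True)
    hist = hist[hist > 0]
    return [float(-np.sum(hist * np.log2(hist)))]


def extract_features(img):
    """Concatenate the hand-crafted features for one grayscale image."""
    return np.array(gradient_features(img) + glcm_features(img) + entropy_feature(img))


# Toy usage: random 8-bit "images" and random MOS values stand in for a real OIQA database.
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, size=(128, 256)) for _ in range(20)]
mos = rng.uniform(1.0, 5.0, size=20)

X = np.vstack([extract_features(im) for im in images])
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, mos)
print(model.predict(X[:3]))  # predicted quality scores for the first three toy images
```

In a realistic setting, the regressor would be trained on features extracted from distorted omnidirectional images paired with their mean opinion scores, and its predictions would be compared against held-out subjective values with correlation measures such as PLCC and SROCC.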