Motivated by the success of convolutional neural networks (CNNs) in image-related applications, this paper presents an effective method for no-reference 3D image quality assessment (3D IQA) built on a CNN-based feature extraction and consolidation strategy. In the first and most critical stage, quality-aware features, which reflect the inherent quality of the images, are extracted by a CNN model fine-tuned using transfer learning. This fine-tuning strategy alleviates the dependence on large-scale training data that limits current deep-learning-based IQA algorithms. In the second stage, features from the left and right views are consolidated by linear weighted fusion, where the weight for each view is derived from its saliency map. In addition, multi-scale statistical characteristics of the disparity map are included as supplementary features. In the final quality-mapping stage, the objective score for each stereoscopic pair is obtained by support vector regression. Experimental results on public databases show that our approach outperforms many existing no-reference and even full-reference methods.
INDEX TERMS: No-reference 3D image quality assessment, deep neural network, transfer learning.
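The consolidation and quality-mapping stages can be illustrated with a minimal sketch, assuming per-view CNN features, saliency maps and a disparity map have already been computed. All function and variable names here (fuse_views, disparity_stats, the random stand-in data) are illustrative assumptions, not the authors' implementation; only the scikit-learn SVR is a real library call.

```python
import numpy as np
from sklearn.svm import SVR

def fuse_views(feat_left, feat_right, sal_left, sal_right):
    # Linear weighted fusion: each view's weight is its normalised mean saliency.
    w_l, w_r = sal_left.mean(), sal_right.mean()
    total = w_l + w_r
    return (w_l / total) * feat_left + (w_r / total) * feat_right

def disparity_stats(disp, scales=(1, 2, 4)):
    # Multi-scale statistics of the disparity map (here simply mean and std
    # at several subsampling factors).
    stats = []
    for s in scales:
        d = disp[::s, ::s]
        stats += [d.mean(), d.std()]
    return np.asarray(stats)

# Toy usage with random stand-ins for CNN features, saliency and disparity maps.
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(20):                       # 20 hypothetical stereoscopic pairs
    f_l, f_r = rng.normal(size=128), rng.normal(size=128)
    s_l, s_r = rng.random((32, 32)), rng.random((32, 32))
    disp = rng.random((64, 64))
    X.append(np.concatenate([fuse_views(f_l, f_r, s_l, s_r), disparity_stats(disp)]))
    y.append(rng.random())                # stand-in subjective score (e.g. MOS)

regressor = SVR(kernel="rbf").fit(np.stack(X), np.array(y))
predicted_score = regressor.predict(np.stack(X)[:1])
```

In practice the regressor would be trained on the fused features of a labelled IQA database rather than random data; the sketch only shows how saliency-weighted fusion, disparity statistics and support vector regression fit together.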
Image quality assessment is of fundamental importance for various image processing applications. A novel method is presented in which the joint occurrences of a statistical local representation by log-Gabor filters, texture analysis by local tetra patterns, and colour histograms are used as quality-aware features. The dissimilarities of these features between the distorted and reference images are then quantified and mapped to a predicted quality score by support vector regression. Extensive experiments on the LIVE, CSIQ and TID databases show that the proposed method is remarkably consistent with human perception, outperforms many state-of-the-art methods, and is robust across different distortion types and databases.
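A minimal sketch of the quality-mapping step described above follows, assuming the log-Gabor, local-tetra-pattern and colour-histogram features have already been extracted as histograms. The chi-square distance is used here only as a stand-in dissimilarity measure, and all names and the synthetic data are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVR

def chi_square_distance(h_ref, h_dist, eps=1e-10):
    # Per-bin chi-square dissimilarity between reference and distorted histograms.
    return (h_ref - h_dist) ** 2 / (h_ref + h_dist + eps)

# Toy usage with random stand-ins for the concatenated quality-aware features
# of reference and distorted images.
rng = np.random.default_rng(1)
X, y = [], []
for _ in range(30):                           # 30 hypothetical image pairs
    ref = rng.random(64)
    dist = ref + 0.1 * rng.random(64)         # simulated distortion
    X.append(chi_square_distance(ref, dist))  # dissimilarity vector as input
    y.append(rng.random())                    # stand-in subjective score (e.g. DMOS)

model = SVR(kernel="rbf").fit(np.stack(X), np.array(y))
quality = model.predict(np.stack(X)[:1])
```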