2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX)
DOI: 10.1109/qomex.2016.7498922

Evaluating color difference measures in images

Abstract: The most well-known and widely used method for comparing two homogeneous color samples is the CIEDE2000 color difference formula, because of its strong agreement with human perception. However, the formula is unreliable when applied over images, and its spatial extensions have shown little improvement compared with the original formula. Hence, researchers have proposed many methods intending to measure color differences (CDs) in natural scene color images. However, these existing methods have not yet been rigoro…
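As context for the abstract, the simplest way CIEDE2000 is usually extended to images (the baseline the paper calls unreliable) is to apply the formula pixel-wise in CIELAB space and pool the per-pixel differences, for example by averaging. The sketch below illustrates that baseline with scikit-image; the file names, the alpha-channel handling, and the mean pooling are assumptions for illustration, not the evaluation protocol of the paper.

```python
# Minimal sketch of a pixel-wise CIEDE2000 baseline (not the paper's method).
import numpy as np
from skimage import io, img_as_float
from skimage.color import rgb2lab, deltaE_ciede2000

def mean_ciede2000(reference_rgb, distorted_rgb):
    """Mean per-pixel CIEDE2000 difference between two RGB images."""
    # Drop a possible alpha channel and convert to CIELAB.
    ref_lab = rgb2lab(img_as_float(reference_rgb)[..., :3])
    dist_lab = rgb2lab(img_as_float(distorted_rgb)[..., :3])
    delta_e = deltaE_ciede2000(ref_lab, dist_lab)  # per-pixel ΔE00 map
    return float(np.mean(delta_e))                 # simple mean pooling

if __name__ == "__main__":
    ref = io.imread("reference.png")   # placeholder paths
    dist = io.imread("distorted.png")
    print(f"Mean CIEDE2000: {mean_ciede2000(ref, dist):.3f}")
```

The mean-pooling step is exactly where such pixel-wise extensions tend to break down on natural images, which motivates the alternative CD measures the paper evaluates.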

Cited by 14 publications (4 citation statements)
References: 30 publications
“…Although HDR-VDP-2.2 has a lower performance on the combined dataset compared to its performance on individual databases, it is, together with HDR-VQM and PU-MSSIM, among the three most correlated metrics in the case excluding Database #2. Interestingly, the HDR-VQM metric, which has been designed to predict video fidelity, gives excellent results also in the case of static images. As for the color difference metrics considered in (Ortiz-Jaramillo et al 2016), they have lower correlation scores when compared to luminance-only metrics. In fact, this result is not in disagreement with Ortiz-Jaramillo et al (2016), which did not consider compression artifacts in the experiments, as the impact of those on image quality was deemed to be much stronger than that of color differences.…”
Section: Discussion
Mentioning confidence: 99%
“…In order to approximate the subjective score of the color difference due to gamut reduction in an objective manner, we employ the color extension of the structural similarity index (CSSIM) [49], [50], which can effectively measure perceptually significant structural differences due to gamut reduction between two color images. The preliminary study [1] shows that it achieves the highest accuracy among eight commonly used objective color difference metrics [51], [52], [53], [54], [55], [56], [57].…”
Section: Fitting Objective Metric
Mentioning confidence: 99%
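To make the CSSIM reference in the statement above more concrete, the sketch below shows the general pattern of a color extension of SSIM: compute structural similarity per channel in a perceptual color space and pool across channels. This is a simplified illustration under my own assumptions (the function name and Lab channel-wise averaging are mine), not the CSSIM formulation of [49], [50].

```python
# Rough illustration of a channel-wise SSIM in CIELAB (NOT the CSSIM of [49], [50]).
import numpy as np
from skimage import img_as_float
from skimage.color import rgb2lab
from skimage.metrics import structural_similarity

def lab_channelwise_ssim(reference_rgb, distorted_rgb):
    """Average SSIM over the L*, a*, b* channels of two RGB images."""
    ref_lab = rgb2lab(img_as_float(reference_rgb)[..., :3])
    dist_lab = rgb2lab(img_as_float(distorted_rgb)[..., :3])
    scores = []
    for ch in range(3):
        ref_c, dist_c = ref_lab[..., ch], dist_lab[..., ch]
        # data_range must be given explicitly for float inputs;
        # guard against constant channels to avoid a zero range.
        rng = max(ref_c.max(), dist_c.max()) - min(ref_c.min(), dist_c.min())
        rng = max(rng, 1e-6)
        scores.append(structural_similarity(ref_c, dist_c, data_range=rng))
    return float(np.mean(scores))  # pool by averaging the three channel scores
```

How the actual CSSIM weights or combines channels is defined in [49], [50]; the point here is only the overall structure of such a color extension.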
“…For instance, Ortiz-Jaramillo et al. [48] demonstrated that current color difference measures (i.e., FR-IQA methods that compute color differences between processed and reference images) show little correlation with subjective quality scores. Also, even though some DS-IQA methods are able to predict the quality of contrast-distorted images [49], most GP-IQA methods have poor prediction performance.…”
Section: Introduction
Mentioning confidence: 99%