Many quality metrics take gamma-corrected images as input and assume that pixel code values are perceptually uniformly scaled. Although this assumption holds for darker displays operating in the luminance range typical of CRT displays (0.1 to 80 cd/m²), it no longer holds for much brighter LCD displays (typically up to 500 cd/m²), plasma displays (small regions up to 1000 cd/m²), and HDR displays (up to 3000 cd/m²). Distortions that are barely visible on dark displays become clearly noticeable on much brighter displays. To estimate the quality of images shown on bright displays, we propose a straightforward extension to popular quality metrics, such as PSNR and SSIM, that makes them capable of handling all luminance levels visible to the human eye without altering their results at typical CRT display luminance levels. Such extended quality metrics can be used to estimate the quality of high dynamic range (HDR) images as well as to account for display brightness.
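The core idea is to feed the metric perceptually uniform code values derived from absolute luminance rather than display-referred pixel values. Below is a minimal Python sketch of PSNR computed this way; the `pu_encode` function and the `LUM_MIN`/`LUM_MAX` bounds are placeholder assumptions (a rescaled log curve standing in for the paper's calibrated PU transfer function, which is fitted to match sRGB code values over the 0.1 to 80 cd/m² CRT range), not the published encoding.

```python
import numpy as np

LUM_MIN, LUM_MAX = 1e-2, 1e4  # assumed visible luminance range in cd/m^2

def pu_encode(lum):
    # Placeholder perceptually uniform encoding: log-luminance rescaled
    # to [0, 255]. The actual metric uses a calibrated PU curve that
    # reproduces sRGB code values in the CRT luminance range.
    logl = np.log10(np.clip(lum, LUM_MIN, LUM_MAX))
    lo, hi = np.log10(LUM_MIN), np.log10(LUM_MAX)
    return 255.0 * (logl - lo) / (hi - lo)

def pu_psnr(ref_lum, test_lum):
    """PSNR over PU-encoded absolute luminance maps (cd/m^2)."""
    mse = np.mean((pu_encode(ref_lum) - pu_encode(test_lum)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)
```

Because the encoding approximates sRGB in the CRT range, PSNR/SSIM scores for conventional display luminances stay close to their standard values, while content on brighter or HDR displays is judged on a perceptually uniform scale.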
Figure 1: Quality assessment of an LDR image (left), generated by tone-mapping the reference HDR image (center) using Pattanaik's tone-mapping operator. Our metric detects loss of visible contrast (green) and contrast reversal (red), visualized as an in-context distortion map (right).

Abstract: The diversity of display technologies and the introduction of high dynamic range imagery create the need to compare images of radically different dynamic ranges. Current quality assessment metrics are not suitable for this task, as they assume that both reference and test images have the same dynamic range. The image fidelity measures employed by the majority of current metrics, based on the difference of pixel intensity or contrast values between the test and reference images, produce meaningless predictions when this assumption does not hold. We present a novel image quality metric capable of operating on an image pair in which the two images have arbitrary dynamic ranges. Our metric utilizes a model of the human visual system, and its central idea is a new definition of visible distortion based on the detection and classification of visible changes in image structure. Our metric is carefully calibrated, and its performance is validated through perceptual experiments. We demonstrate possible applications of our metric to the evaluation of direct and inverse tone-mapping operators as well as the analysis of image appearance on displays with various characteristics.
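To make the classification idea concrete, here is a hedged per-pixel sketch in Python. It assumes a visual model has already produced band-pass contrast responses (`c_ref`, `c_test`) and detection probabilities (`p_ref`, `p_test`); the function name, inputs, and the `p_thr` threshold are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def classify_distortions(c_ref, c_test, p_ref, p_test, p_thr=0.5):
    """Classify structural changes between reference and test images.
    c_ref, c_test: band-pass contrast responses (hypothetical inputs)
    p_ref, p_test: predicted contrast-detection probabilities in [0, 1]
    """
    visible_ref = p_ref >= p_thr
    visible_test = p_test >= p_thr
    # Loss of visible contrast: structure visible in the reference
    # disappears in the test image (e.g., detail crushed by tone mapping).
    loss = visible_ref & ~visible_test
    # Amplification: contrast invisible in the reference becomes
    # visible in the test image (e.g., contouring, amplified noise).
    amplification = ~visible_ref & visible_test
    # Reversal: contrast is visible in both images but flips polarity.
    reversal = visible_ref & visible_test & (np.sign(c_ref) != np.sign(c_test))
    return loss, amplification, reversal
```

Classifying changes in visibility, rather than differencing intensities, is what lets the comparison remain meaningful when the two images have different dynamic ranges.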
Figure 1: One application of our method is the temporally consistent propagation of scribbles through video volumes. Sparse feature correspondences from an input video (a) are used to compute optical flow (c). Then, color scribbles (b) are spread in space and time to compute the final coherent output (d).

Abstract: We present an efficient and simple method for introducing temporal consistency to a large class of optimization-driven, image-based computer graphics problems. Our method extends recent work in edge-aware filtering, approximating costly global regularization with a fast iterative joint-filtering operation. Using this representation, we achieve tremendous efficiency gains in both memory requirements and running time. This enables us to process entire shots at once, taking advantage of supporting information that exists across far-apart frames, which is difficult with existing approaches due to the computational burden of video data. Our method is able to filter along motion paths using an iterative approach that simultaneously uses and estimates per-pixel optical flow vectors. We demonstrate its utility by creating temporally consistent results for a number of applications, including optical flow, disparity estimation, colorization, scribble propagation, sparse data up-sampling, and visual saliency computation.
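The sketch below illustrates one plausible iteration of such a scheme in Python: each frame's estimate is blended with the previous frame's estimate warped along optical flow, weighted by color similarity in the guide video. The function name, the nearest-neighbor warp, and the Gaussian color weight are simplifying assumptions standing in for the paper's edge-aware filter, not its actual implementation.

```python
import numpy as np

def temporal_joint_filter_step(values, guide, flow, sigma_c=0.1, alpha=0.5):
    """One hypothetical joint-filtering iteration along motion paths.
    values: (T, H, W) quantity being regularized (e.g., a scribble mask)
    guide:  (T, H, W, 3) input video used as the joint-filtering guide
    flow:   (T, H, W, 2) backward flow (frame t -> t-1), (u, v) order
    """
    T, H, W = values.shape
    out = values.copy()
    ys, xs = np.mgrid[0:H, 0:W]
    for t in range(1, T):
        # Follow the motion path back to frame t-1 (nearest-neighbor warp).
        py = np.clip(np.round(ys + flow[t, ..., 1]).astype(int), 0, H - 1)
        px = np.clip(np.round(xs + flow[t, ..., 0]).astype(int), 0, W - 1)
        warped = out[t - 1][py, px]
        # Joint weight: suppress blending across color edges in the guide,
        # so temporal smoothing does not bleed over object boundaries.
        diff = guide[t] - guide[t - 1][py, px]
        w = alpha * np.exp(-np.sum(diff ** 2, axis=-1) / (2 * sigma_c ** 2))
        out[t] = (1 - w) * values[t] + w * warped
    return out
```

Repeating such a cheap local step approximates the effect of a costly global regularizer over the whole video volume, which is what makes processing entire shots at once tractable.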