As the dynamic range of a digital camera is narrower than that of a real scene, the captured image requires a tone curve or contrast correction to reproduce the information in dark regions. Yet, with a global correction method, such as histogram-based methods or gamma correction, an unintended contrast enhancement can result in bright regions. Thus, a multiscale retinex algorithm using Gaussian filters was previously proposed to enhance the local contrast of a captured image using the ratio between the intensity of an arbitrary pixel in the captured image and that of its surrounding pixels. The intensity of the surrounding pixels is estimated using Gaussian filters and a weight for each filter, and to obtain better results, these Gaussian filters and weights are adjusted according to the captured image. Nonetheless, this adjustment is currently a subjective process, as no method has yet been developed for optimizing the Gaussian filters and weights according to the captured image. Therefore, this article proposes local contrast enhancement based on an adaptive multiscale retinex with a Gaussian filter set adapted to the input image. First, the weight of the largest Gaussian filter is determined using the local contrast ratio from the intensity distribution of the input image. The other Gaussian filters and the weight for each filter in the multiscale retinex are then determined using a visual contrast measure and the maximum color difference of the color patches in the Macbeth color checker. The visual contrast measure is based on the product of the local standard deviation and the locally averaged luminance of the image. Meanwhile, to evaluate the halo artifacts generated in large uniform regions that abut to form a high-contrast edge, the artifacts are assessed using the maximum color difference between each pixel color in a patch of the Macbeth color checker and the averaged color in the CIELAB standard color space. While accounting for this color difference due to halo artifacts, the parameters for the Gaussian filters and weights that yield a higher visual contrast measure are determined using test images. In addition, to reduce the induced graying-out, the chroma of the resulting image is compensated by preserving the chroma ratio of the input image based on the maximum chroma values of the sRGB color gamut in the lightness-chroma plane. In experiments, the proposed method is shown to improve the local contrast and saturation in a natural way.

INTRODUCTION
Human vision is a complicated automatic self-adapting system that is capable of seeing over 5 orders of magnitude simultaneously and can gradually adapt to natural-world scenes with a high dynamic range of over 9 orders of magnitude. Thus, human vision can concurrently perceive
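The core of the multiscale retinex described in the abstract above is a weighted sum of log-ratios between each pixel and its Gaussian-blurred surround, combined with a visual contrast measure built from local statistics. The following is a minimal sketch under assumed parameters: the sigmas, weights, and window size are illustrative placeholders, not the values the article optimizes with the Macbeth color checker.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def multiscale_retinex(intensity, sigmas=(15, 80, 250), weights=(1/3, 1/3, 1/3)):
    """Weighted sum of single-scale retinex outputs for a single-channel image in [0, 1]."""
    eps = 1e-6                                    # avoid log(0)
    log_i = np.log(intensity + eps)
    out = np.zeros_like(intensity, dtype=np.float64)
    for sigma, w in zip(sigmas, weights):
        surround = gaussian_filter(intensity, sigma=sigma)   # surround estimate at this scale
        out += w * (log_i - np.log(surround + eps))          # log ratio of pixel to surround
    return out

def visual_contrast_measure(luminance, window=7):
    """Mean of (local standard deviation * locally averaged luminance), as in the abstract."""
    local_mean = uniform_filter(luminance, size=window)
    local_sq = uniform_filter(luminance ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq - local_mean ** 2, 0.0))
    return float(np.mean(local_std * local_mean))
```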
Every type of color imaging device has its own color reproduction method, providing color characteristic information. Thus, a color management system with an ICC profile is generally used to reproduce colors between two different color imaging devices, for example, from a monitor to a printer. However, once the ICC profile of a device is measured and stored, it usually remains unchanged. Yet, a user can sometimes control the monitor configuration, such as the color temperature, contrast, and brightness, according to their preference, thereby changing the color characteristics of the monitor. In addition, a typical end user's viewing conditions do not match the standard environment. In this case, if the user then prints an image shown on the monitor screen, the color of the printed image will not match the color displayed on the monitor screen, as the color characteristics in the ICC profile provided by the monitor manufacturer will no longer represent the user-configured color characteristics of the monitor. Therefore, this article proposes a method for user-configured monitor-to-printer color reproduction based on an estimation of the monitor characteristics under the user's monitor configuration and viewing conditions. First, the color characteristics according to changes in the monitor configuration are measured and modeled in terms of the red-green-blue (RGB) chromaticity and tone curve. Second, to estimate the color characteristics of the user's monitor, color matching is performed between a printed color chart and a color chart image reproduced on the monitor screen via soft proofing. The RGB chromaticity and tone curve models are used in the soft proofing process to adjust the monitor's color characteristics. After color matching, the color characteristics of the user's monitor are obtained. Finally, monitor-to-printer color reproduction is evaluated based on the color characteristics obtained for the monitor. Experimental results using the proposed method show that the printed images have almost the same colors as the images on the real user-configured monitor screen.
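A monitor is commonly characterized by the chromaticities of its RGB primaries plus a per-channel tone curve. The sketch below is only a generic illustration of such a model, not the article's measured one: it builds an RGB-to-XYZ matrix from assumed chromaticity coordinates and applies a simple single-gamma tone curve, whereas the article's tone-curve model is derived from measurements of the user-configured monitor.

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_white):
    """Build an RGB->XYZ matrix from primary and white-point chromaticities (x, y)."""
    def xy_to_xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    primaries = np.column_stack([xy_to_xyz(xy_r), xy_to_xyz(xy_g), xy_to_xyz(xy_b)])
    white = xy_to_xyz(xy_white)
    gains = np.linalg.solve(primaries, white)    # scale each primary so the mix hits the white point
    return primaries * gains

def monitor_rgb_to_xyz(rgb, matrix, gamma=2.2):
    """Apply a per-channel tone curve, then the chromaticity matrix."""
    linear = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) ** gamma
    return matrix @ linear

# Example with sRGB-like primaries and a D65 white point (illustrative values only).
M = rgb_to_xyz_matrix((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), (0.3127, 0.3290))
xyz = monitor_rgb_to_xyz((0.5, 0.5, 0.5), M)
```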
This paper presents a color enhancement algorithm for faded images based on a multi-scale gray world algorithm. First, the proposed method adopts a local process using multi-scale masks. The coefficients for each multi-scale mask are obtained by applying the gray world algorithm. The coefficients are then integrated with weights to calculate the correction ratios for the red and blue channels under the gray world assumption. Finally, the corrected image is obtained by applying the integrated coefficients in the gray world algorithm. Experimental results show that the proposed method reproduces better colors for both wholly and partially faded images compared with previous methods.
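For reference, the classical global gray world correction on which this method builds scales the red and blue channels so that their means match the green-channel mean; the paper computes such correction ratios per multi-scale mask and blends them with weights. The sketch below shows only the global case and assumes a float RGB image in [0, 1].

```python
import numpy as np

def gray_world_correction(image):
    """Global gray world balance for an (H, W, 3) float RGB image in [0, 1]."""
    mean_r, mean_g, mean_b = image.reshape(-1, 3).mean(axis=0)
    corrected = image.copy()
    corrected[..., 0] *= mean_g / mean_r     # correction ratio for the red channel
    corrected[..., 2] *= mean_g / mean_b     # correction ratio for the blue channel
    return np.clip(corrected, 0.0, 1.0)
```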