In this paper we develop a computational model of visual adaptation for realistic image synthesis, based on psychophysical experiments. The model captures the changes in threshold visibility, color appearance, visual acuity, and sensitivity over time that are caused by the visual system's adaptation mechanisms. We use the model to display the results of global illumination simulations of scenes illuminated at levels ranging from daylight down to starlight. The resulting images better capture the visual characteristics of scenes viewed over a wide range of illumination levels. Because the model is based on psychophysical data, it can predict the visibility and appearance of scene features, which allows it to serve as the basis of perceptually based error metrics for limiting the precision of global illumination computations.
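The threshold-visibility component of a model like this is typically driven by threshold-versus-intensity (TVI) functions for the rod and cone systems. Below is a minimal Python sketch of piecewise log-log TVI fits of the kind used with this model; the constants follow the fits commonly reproduced for it and should be checked against the original paper, and the function names are our own.

```python
import math

def tvi_cones(log_la: float) -> float:
    """Cone (photopic) threshold: log10 of the detection-threshold luminance
    as a function of log10 background (adaptation) luminance in cd/m^2.
    Piecewise fit as commonly reproduced for this model (assumed constants)."""
    if log_la <= -2.6:
        return -0.72                       # absolute cone threshold
    if log_la >= 1.9:
        return log_la - 1.255              # Weber-law region: dL/L constant
    return (0.249 * log_la + 0.65) ** 2.7 - 0.72  # transition region

def tvi_rods(log_la: float) -> float:
    """Rod (scotopic) threshold, same convention as tvi_cones."""
    if log_la <= -3.94:
        return -2.86                       # absolute rod threshold
    if log_la >= -1.44:
        return log_la - 0.395              # Weber-law region
    return (0.405 * log_la + 1.6) ** 2.18 - 2.86

def detection_threshold(la: float) -> float:
    """Smallest visible luminance difference (cd/m^2) on a uniform background
    of luminance la, taking whichever system is more sensitive."""
    log_la = math.log10(la)
    return 10.0 ** min(tvi_cones(log_la), tvi_rods(log_la))

# At office-light levels (~100 cd/m^2) the threshold is a few percent of the
# background; in moonlight (~0.01 cd/m^2) it is far lower in absolute terms
# but a much larger fraction of the background, so less detail is visible.
print(detection_threshold(100.0), detection_threshold(0.01))
```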
In this paper we present a multiscale color appearance model that simulates the luminance, pattern, and color processing of the human visual system. The model accurately predicts the color appearance attributes of spectral stimuli in complex surroundings under a wide range of illumination and viewing conditions.
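The multiscale front end of such a model decomposes the stimulus into bandpass channels before adaptation and contrast processing are applied per band. The following Python sketch shows one conventional way to build such a decomposition, a difference-of-Gaussians pyramid with local band-limited contrast in the style of Peli; the level count and octave spacing are illustrative choices, not the model's calibrated parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def contrast_pyramid(luminance: np.ndarray, n_levels: int = 6):
    """Decompose a luminance image into multiscale local-contrast bands.

    Each band is a difference-of-Gaussians detail image divided by the
    lowpass image at that scale (local band-limited contrast). Sigmas
    double per level -- an illustrative octave spacing.
    Returns (bands, lowpass residual).
    """
    bands = []
    current = luminance.astype(np.float64)
    sigma = 1.0
    for _ in range(n_levels):
        blurred = gaussian_filter(current, sigma)
        detail = current - blurred                        # bandpass detail
        bands.append(detail / np.maximum(blurred, 1e-6))  # local contrast
        current = blurred                                 # cascade the lowpass
        sigma *= 2.0                                      # next octave
    return bands, current

# The residual lowpass carries the overall luminance level that drives
# adaptation; the contrast bands feed pattern and color processing.
img = np.random.rand(128, 128) * 100.0   # stand-in luminance map, cd/m^2
bands, base = contrast_pyramid(img)
```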
We introduce a new concept for accelerating realistic image synthesis algorithms. At the core of this procedure is a novel physical error metric that correctly predicts the perceptual threshold for detecting artifacts in scene features. Built into this metric is a computational model of the human visual system's loss of sensitivity at high background illumination levels, high spatial frequencies, and high contrast levels (visual masking). An important feature of our model is that it handles luminance-dependent processing and spatially-dependent processing independently. This allows us to precompute the expensive spatially-dependent component, making our model extremely efficient. We illustrate the utility of our procedure with global illumination algorithms used for realistic image synthesis. The expense of global illumination computations is many orders of magnitude higher than that of direct illumination computations, and it can benefit greatly from our perceptually based technique. Results show that our method preserves visual quality while achieving significant computational gains in areas of images with high-frequency texture patterns, geometric details, and lighting variations.
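The separation the abstract describes suggests a simple implementation structure: a spatially-dependent elevation map is computed once from an inexpensive estimate of the image (for example, a direct-illumination pass with textures), and the luminance-dependent threshold is evaluated per pixel as the solution converges. Here is a minimal Python sketch under those assumptions; spatial_elevation, luminance_threshold, and error_below_threshold are hypothetical names, and the elevation computation is reduced to a crude placeholder rather than the paper's CSF and masking model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def luminance_threshold(la: np.ndarray) -> np.ndarray:
    """Per-pixel detectable luminance difference from a TVI function.
    Uses only the photopic Weber-law branch for brevity (assumption)."""
    log_la = np.log10(np.maximum(la, 1e-6))
    return 10.0 ** (log_la - 1.255)

def spatial_elevation(image: np.ndarray) -> np.ndarray:
    """Threshold-elevation factor >= 1. Placeholder: raises the threshold
    where local contrast is high, loosely mimicking visual masking; the
    real model derives this from a CSF and a masking function over a
    multiscale decomposition."""
    local_mean = gaussian_filter(image, sigma=2.0)
    local_contrast = np.abs(image - local_mean) / np.maximum(local_mean, 1e-6)
    return 1.0 + local_contrast

# Precompute the expensive spatial term once (the key efficiency idea),
# then reuse it each time the converging solution is tested.
direct = np.random.rand(64, 64) * 50.0   # stand-in direct-illumination pass
elevation = spatial_elevation(direct)    # computed once, reused thereafter

def error_below_threshold(current: np.ndarray,
                          error_estimate: np.ndarray) -> np.ndarray:
    """True where the estimated error is perceptually undetectable,
    so further refinement can be skipped in those regions."""
    return error_estimate < luminance_threshold(current) * elevation
```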