Light is scattered and absorbed in the atmosphere, producing visual effects that increase with viewing distance. Contrast and chromatic diversity decrease with distance, thereby modifying the way objects and scenes are perceived. Although some perceptual compensation producing color constancy of individual surfaces has been reported, it is unclear to what extent unknown original images can be visually inferred from their distorted versions, that is, how much these effects can be discounted by the visual system. We investigated this issue with a paradigm akin to the paper-matching paradigm used in color constancy studies, but with complex natural images. Hyperspectral data from 11 natural scenes were used to simulate their colors for viewing distances of up to 2000 m using a precise physical model of atmospheric effects. In each trial of the experiment, observers viewed the range of simulated images derived from one scene, displayed on a calibrated monitor, and selected the one perceived as most natural, that is, free of color distortion. Pooling data across scenes and observers showed that the image selected as most natural was very close to the undistorted one, corresponding to a simulated distance of only 2 m. These results suggest that observers are sensitive to changes in the naturalness of colors caused by atmospheric effects and, crucially, can retrieve the original chromatic content with good accuracy, revealing an efficient form of color constancy.
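The abstract refers to a precise physical model of atmospheric effects without specifying it. For concreteness, a standard formulation of aerial perspective, Koschmieder's model applied per wavelength to hyperspectral radiance, is sketched below; this is an assumption for illustration, not necessarily the authors' exact model:

\[
L_d(\lambda) = L_0(\lambda)\, e^{-\beta(\lambda)\, d} + L_\infty(\lambda)\,\bigl(1 - e^{-\beta(\lambda)\, d}\bigr)
\]

Here $L_0(\lambda)$ is the inherent spectral radiance of a surface at zero distance, $L_\infty(\lambda)$ the horizon (airlight) radiance, $\beta(\lambda)$ the spectral extinction coefficient, and $d$ the viewing distance. As $d$ increases, the exponential term attenuates the surface radiance toward the airlight, which is consistent with the loss of contrast and chromatic diversity with distance described above.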