Learning from visual experience is crucial for perceptual development. A central question is when this learning occurs and to what extent it compensates for changes in the visual system throughout life. To address this question, it is essential to compare human performance not only to the hypothetical state of no recalibration, but also to the ideal scenario of optimal learning given the information available from visual exposure. In the adult eye, macular pigment introduces nonhomogeneity in color filtering between the very center of vision and the periphery, which is known to produce perceptual differences. By modeling cone responses to the spectra of everyday stimuli, we quantify the degree of calibration possible from visual exposure, and therefore the perceptual color distortion that should occur with and without recalibration. We find that perceptual distortions fall halfway between those predicted from bare adaptation and those predicted from learning, despite nearly lifelong exposure to a very systematic bias. We also show that these distortions affect real stimuli and are already robust in the near-periphery. Our findings challenge an assumption that has fueled influential accounts of vision: that the apparent homogeneity of perceived colors across the visual field in everyday life is evidence for continuous learning in perception. Since macular pigment is absent at birth and reaches adult levels before age 2, we argue that the most plausible, though likely controversial, interpretation of our results is early development of color constancy across space, with little recalibration afterwards.
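A minimal sketch of the kind of computation described above: cone responses to a stimulus spectrum are computed with and without macular pigment filtering, so that the foveal versus peripheral signals can be compared. All curve shapes, peak wavelengths, and the 0.35 peak pigment density below are illustrative placeholders, not the paper's actual data or fitted values; a real analysis would use tabulated cone fundamentals and measured macular pigment optical density.

```python
import numpy as np

# Wavelength sampling over the visible range, 5 nm steps.
wl = np.arange(400, 701, 5, dtype=float)

def gaussian(peak, width):
    """Toy unimodal sensitivity curve; a stand-in for real tabulated functions."""
    return np.exp(-0.5 * ((wl - peak) / width) ** 2)

# Placeholder L, M, S cone fundamentals (peaks/widths are assumptions).
cones = np.stack([gaussian(565, 50), gaussian(535, 45), gaussian(445, 30)])

# Toy macular pigment optical density: strongest absorption near 460 nm,
# scaled by an assumed foveal peak density of 0.35.
mp_density = 0.35 * gaussian(460, 40)
mp_transmittance = 10.0 ** (-mp_density)  # fraction of light passing the pigment

def cone_responses(spectrum, through_macular_pigment):
    """Integrate a stimulus spectrum against the cone fundamentals,
    optionally pre-filtering by macular pigment transmittance
    (foveal viewing) versus no filtering (peripheral viewing)."""
    s = spectrum * (mp_transmittance if through_macular_pigment else 1.0)
    return cones @ s  # unnormalized L, M, S excitations

# Example: a flat broadband spectrum as a placeholder stimulus.
stimulus = np.ones_like(wl)

fovea = cone_responses(stimulus, through_macular_pigment=True)
periphery = cone_responses(stimulus, through_macular_pigment=False)

# The S-cone ratio illustrates the short-wavelength attenuation that the
# visual system would need to compensate for to keep colors matched
# across the visual field.
print("S-cone fovea/periphery ratio:", fovea[2] / periphery[2])
```

Comparing such filtered and unfiltered cone signals across many everyday spectra is, in outline, how one can bound the calibration achievable from exposure alone.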