Bayesian observer models provide a principled account of the fact that our perception of the world rarely matches physical reality. The standard explanation is that our percepts are biased toward our prior beliefs. However, reported psychophysical data suggest that this view may be too simplistic. We propose a new model formulation based on efficient coding that is fully specified for any given natural stimulus distribution. The model makes two new and seemingly anti-Bayesian predictions. First, it predicts that perception is often biased away from an observer's prior beliefs. Second, it predicts that stimulus uncertainty differentially affects perceptual bias depending on whether the uncertainty is induced by internal or external noise. We found that both model predictions match reported biases in perceived visual orientation and spatial frequency, and that the model accounts for data that previous formulations could not explain. The model is general and should prove applicable to other perceptual variables and tasks.
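For readers who want the efficient-coding idea in formulas, a minimal sketch, assuming the Fisher-information formulation commonly used for such models (the symbols $\theta$, $p$, and $J$ below are our notation, not taken from the abstract): limited coding resources are allocated in proportion to how often stimulus values occur,

$$\int \sqrt{J(\theta)}\, d\theta = \mathrm{const}, \qquad \sqrt{J(\theta)} \;\propto\; p(\theta),$$

where $p(\theta)$ is the natural stimulus prior and $J(\theta)$ is the Fisher information of the sensory representation. Because the representation is densest where the prior is high, likelihood functions become asymmetric in stimulus space, which can pull estimates away from the prior peak and produce the seemingly anti-Bayesian repulsive bias described above.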
Grid cells in the entorhinal cortex appear to represent spatial location via a triangular coordinate system. Such cells, which have been identified in rats, bats, and monkeys, are believed to support a wide range of spatial behaviors. By recording neuronal activity from neurosurgical patients performing a virtual-navigation task, we identified cells exhibiting grid-like spiking patterns in the human brain, suggesting that humans and simpler animals rely on homologous spatial-coding schemes.
Perception of a stimulus can be characterized by two fundamental psychophysical measures: how well the stimulus can be discriminated from similar ones (discrimination threshold) and how strongly the perceived stimulus value deviates on average from the true stimulus value (perceptual bias). We demonstrate that perceptual bias and discriminability, as functions of the stimulus value, follow a surprisingly simple mathematical relation. The relation, which is derived from a theory combining optimal encoding and decoding, is well supported by a wide range of reported psychophysical data including perceptual changes induced by contextual modulation. The large empirical support indicates that the proposed relation may represent a psychophysical law in human perception. Our results imply that the computational processes of sensory encoding and perceptual decoding are matched and optimized based on identical assumptions about the statistical structure of the sensory environment.
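Written out, the relation reported in this line of work is commonly expressed as (our notation; $b$ is the perceptual bias and $D$ the discrimination threshold, both as functions of the stimulus value $\theta$):

$$b(\theta) \;\propto\; \frac{d}{d\theta}\!\left[D(\theta)^{2}\right],$$

i.e., the bias is proportional to the slope of the squared discrimination threshold, so it vanishes where discriminability is locally uniform and points away from stimulus values that are easiest to discriminate, toward values that are harder to discriminate.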
Grid cells in the brain respond when an animal occupies a periodic lattice of ‘grid fields’ during navigation. Grids are organized in modules with different periodicity. We propose that the grid system implements a hierarchical code for space that economizes the number of neurons required to encode location with a given resolution across a range equal to the largest period. This theory predicts that (i) grid fields should lie on a triangular lattice, (ii) grid scales should follow a geometric progression, (iii) the ratio between adjacent grid scales should be √e for idealized neurons, and lie between 1.4 and 1.7 for realistic neurons, and (iv) the scale ratio should vary modestly within and between animals. These results explain the measured grid structure in rodents. We also predict optimal organization in one and three dimensions, the number of modules, and, with added assumptions, the ratio between grid periods and field widths.
DOI: http://dx.doi.org/10.7554/eLife.08362.001
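A compact way to see how a √e scale ratio can emerge, as a sketch under assumptions consistent with the abstract (two-dimensional space, an idealized decoder; the symbols $r$, $N$, $R$, and $\ell$ are ours): if adjacent module scales differ by a fixed ratio $r$, covering a range $R$ down to resolution $\ell$ takes $m = \ln(R/\ell)/\ln r$ modules, and each module needs roughly $(\lambda_i/\lambda_{i+1})^{2} = r^{2}$ neurons to tile its period with fields fine enough for the next module to take over. The total neuron count then scales as

$$N(r) \;\propto\; m \cdot r^{2} \;=\; \frac{\ln(R/\ell)}{\ln r}\, r^{2}, \qquad \frac{dN}{dr}=0 \;\Rightarrow\; \ln r = \tfrac{1}{2} \;\Rightarrow\; r = \sqrt{e}\approx 1.65.$$

In one dimension the same argument would give $r = e$, and smoother, more realistic tuning shifts the optimum into the 1.4–1.7 range quoted above.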