After the presentation of a visual stimulus, cortical visual processing cascades from low-level sensory features in primary visual areas to increasingly abstract representations in higher-level areas. It is often hypothesized that the reverse process underpins the human ability to generate mental images. Under this hypothesis, visual information feeds back from high-level areas: abstract representations are used to construct the sensory representation in primary visual cortices. Such reversals of information flow are also hypothesized to play a central role in later stages of perception. According to predictive processing theories, ambiguous sensory information is resolved using abstract representations from high-level areas through oscillatory rebounds between different levels of the visual hierarchy. However, despite the elegance of these theoretical models, there is to date no direct experimental evidence of a reversal of visual information flow during mental imagery and perception. In the first part of this paper, we provide direct evidence in humans for a reverse order of activation of the visual hierarchy during imagery. Specifically, we show that machine-learning classifiers trained on brain data at different time points during the early feedforward phase of perception are reactivated in reverse order during mental imagery. In the second part of the paper, we report an 11 Hz oscillatory alternation between feedforward and reversed visual processing phases during perception. Together, these results are in line with the idea that during perception, the high-level cause of sensory input is inferred through recurrent hypothesis updating, whereas during imagery, this learned forward mapping is reversed to generate sensory signals from abstract representations.
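The decoding approach described above resembles a temporal generalization analysis: a classifier is trained on sensor patterns at each perception time point and tested at every time point, and the resulting train-time × test-time accuracy matrix shows when a given representation recurs. A minimal, self-contained sketch on synthetic data (the array dimensions, the scikit-learn pipeline, and the injected "feedforward sweep" are illustrative assumptions, not the authors' actual data or pipeline):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for MEG recordings: trials x sensors x time points.
# (Hypothetical dimensions, not the study's real data.)
n_trials, n_sensors, n_times = 100, 32, 20
X = rng.standard_normal((n_trials, n_sensors, n_times))
y = rng.integers(0, 2, n_trials)  # stimulus category per trial

# Inject a category signal that moves to a different sensor at each
# time point, mimicking a feedforward sweep through the hierarchy.
for t in range(n_times):
    X[y == 1, t, t] += 2.0

train, test = np.arange(60), np.arange(60, 100)

# Temporal generalization: train one classifier per training time point,
# then evaluate it at every testing time point.
scores = np.empty((n_times, n_times))
for t_train in range(n_times):
    clf = make_pipeline(StandardScaler(), LogisticRegression())
    clf.fit(X[train][:, :, t_train], y[train])
    for t_test in range(n_times):
        scores[t_train, t_test] = clf.score(X[test][:, :, t_test], y[test])

# High diagonal accuracy = time-specific neural codes; structured
# off-diagonal patterns would indicate earlier representations
# recurring at later times, possibly in reverse order.
print(round(np.diag(scores).mean(), 2))
```

In the study itself, classifiers trained on perception data are then applied to imagery trials; a reversed order of reactivation would appear as a flipped (anti-diagonal) pattern in that cross-condition generalization matrix.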
Perception | Mental imagery | MEG | Multivariate pattern analysis | Generative models | Predictive processing

Correspondence: n.dijkstra@donders.ru.nl

When light hits the retina, a complex cascade of neural processing is triggered. Light waves are transformed into electrical signals that travel via the lateral geniculate nucleus of the thalamus to the visual cortex (1, 2). First, low-level visual features such as orientation and spatial frequency are processed in primary, posterior visual areas (3), after which activation spreads forward towards secondary, more anterior visual areas where high-level features such as shape and eventually semantic category are processed (4-6). This initial feedforward flow through the visual hierarchy is completed within 150 ms (7, 8), after which feedback processes are assumed to further sharpen representations over time until a stable percept is achieved (9, 10). Activation in visual areas can also be triggered internally, in the absence of external sensory signals. During mental imagery, information from memory is used to generate rich visual representations. Neural representations activated during imagery are highly similar to those activated during perception (11). Imagining an object activates similar obje...