Humans and other primates rely on eye movements to explore visual scenes and to track moving objects. As a result, the image that is projected onto the retina, and propagated throughout the visual cortical hierarchy, is almost constantly changing and makes little sense without taking into account the momentary direction of gaze. How is this achieved in the visual system? Here we show that in primary visual cortex (V1), the earliest stage of cortical vision, neural representations carry an embedded "eye tracker" that signals the direction of gaze associated with each image. Using chronically implanted multi-electrode arrays, we recorded the activity of neurons in V1 during tasks requiring fast (exploratory) and slow (pursuit) eye movements. Neurons were stimulated with flickering, full-field luminance noise at all times. As in previous studies1-4, we observed neurons that were sensitive to gaze direction during fixation, despite comparable stimulation of their receptive fields. We trained a decoder to translate neural activity into metric estimates of (stationary) gaze direction. This decoded signal tracked the eye accurately not only during fixation but also during fast and slow eye movements, even though the decoder had not been exposed to data from these behavioural states. Moreover, this signal lagged the real eye by approximately the time it takes for new visual information to travel from the retina to cortex. Using simulations, we show that this V1 eye-position signal could be used to take into account the sensory consequences of eye movements and map the fleeting positions of objects on the retina onto their stable positions in the world.

During everyday vision, fast eye movements known as 'saccades' shift the line of sight to place the high-resolution fovea onto objects of interest. These movements are complemented by slower, smooth eye rotations used to track moving objects (e.g. a bird in flight) or to lock gaze on an object during locomotion5.
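The remapping idea summarised above, adding a decoded (and therefore lagging) eye-position signal to the eye-centred position of a stimulus to recover a stable world-centred position, can be sketched in a few lines. This is a minimal illustration, not the study's actual simulation: the pursuit speed, retino-cortical delay, and sampling interval below are assumed values chosen only to make the arithmetic visible.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the study)
LAG_MS = 50   # assumed retina-to-cortex delay of the decoded signal
DT_MS = 10    # sampling interval of the simulated signals

t = np.arange(0, 500, DT_MS)          # time axis in ms (50 samples)
eye_pos = 0.02 * t                    # smooth pursuit: gaze drifts rightward (deg)
world_obj = np.full(t.shape, 10.0)    # object fixed at 10 deg in world coordinates

# The retinal (eye-centred) position is world position minus gaze direction.
retinal_pos = world_obj - eye_pos

# A decoded eye-position signal: the true eye position, delayed by LAG_MS.
lag = LAG_MS // DT_MS
decoded_eye = np.concatenate([np.zeros(lag), eye_pos[:-lag]])

# Naive combination: the residual error equals how far the eye moved
# during the lag (here 0.02 deg/ms * 50 ms = 1 deg).
naive = retinal_pos + decoded_eye
print(round(float(naive[-1]), 2))  # → 9.0

# Lag-compensated combination: shift the decoded signal forward by the
# known delay before adding it back; the world estimate is now stable.
compensated = retinal_pos[:-lag] + decoded_eye[lag:]
print(round(float(compensated[-1]), 2))  # → 10.0
```

The point of the sketch is that a purely retinal place code plus a metric eye-position signal suffices, in principle, to reconstruct stable world-centred positions, provided the signal's lag is accounted for.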
Though essential, these movements have stark consequences for the retina's view of the world: stationary objects appear to jump or move, and moving objects travel along grossly distorted trajectories (Fig. 1). These phenomena reflect the fact that visual input arrives at the nervous system in an eye-centred (or retinal) coordinate system, in which objects evoke activity within a local patch of photoreceptors and ganglion cells (i.e. as a place code). Our perception of the world, however, is entirely different: objects do not appear to change positions every time we move our eyes, and we have no trouble planning actions toward them. This suggests that the visual system uses internal knowledge of eye position (and