Despite growing evidence for perceptual interactions between motion and position, no unifying framework exists to account for these two key features of our visual experience. We show that percepts of both object position and motion derive from a common object-tracking system: a system that optimally integrates sensory signals with a realistic model of motion dynamics, effectively inferring their generative causes. The object-tracking model provides an excellent fit to both position and motion judgments in simple stimuli. With no changes in model parameters, the same model also accounts for subjects' novel illusory percepts in more complex moving stimuli. The resulting framework is characterized by a strong bidirectional coupling between position and motion estimates and provides a rational, unifying account of a number of motion and position phenomena that are currently thought to arise from independent mechanisms. These include motion-induced shifts in perceived position, perceptual slow-speed biases, slowing of motion viewed in the visual periphery, and the well-known curveball illusion. These results reveal that motion perception cannot be isolated from position signals. Even in the simplest displays with no changes in object position, our perception is driven by the output of an object-tracking system that rationally infers different generative causes of motion signals. Taken together, we show that object tracking plays a fundamental role in the perception of visual motion and position.

visual motion perception | Kalman filter | object tracking | causal inference | motion-induced position shift

Research into the basic mechanisms of visual motion processing has largely focused on simple cases in which motion signals are fixed in space and constant over time (e.g., moving patterns presented in static windows) (1).
Although this approach has resulted in considerable advances in our understanding of low-level motion mechanisms, it leaves open the question of how the brain integrates changing motion and position signals; when objects move in the world, motion generally co-occurs with changes in object position. The process of generating coherent estimates of object motion and position is known in the engineering and computer vision literature as "tracking" (e.g., as used by the Global Positioning System) (2). Conceptualizing motion and position perception in the broader context of object tracking suggests an alternative conceptual framework, one that we show provides a unifying account for a number of perceptual phenomena.

An optimal tracking system would integrate incoming position and motion signals with predictive information from the recent past to continuously update perceptual estimates of both an object's position and its motion. Were such a system to underlie perception, position and motion should be perceptually coupled in predictable ways. Signatures of such a coupling appear in a number of known phenomena. On one hand, local motion signals can predictively bias position percepts (3-8). On the other hand, we can pe...
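The kind of optimal tracker described above is standardly formalized as a Kalman filter (one of this paper's keywords). As an illustrative sketch only — a one-dimensional constant-velocity model with arbitrary noise parameters, not the authors' fitted model — a filter over the state [position, velocity] shows how each noisy position measurement revises both estimates at once:

```python
def kalman_track(measurements, dt=1.0, q=0.01, r=1.0):
    """Track [position, velocity] from noisy position-only measurements.

    dt: time step; q: process-noise variance; r: measurement-noise
    variance. All values are illustrative, not fitted to data.
    """
    x = [measurements[0], 0.0]      # state estimate [pos, vel]
    P = [[1.0, 0.0], [0.0, 1.0]]    # estimate covariance (2x2)
    estimates = []
    for z in measurements:
        # Predict: constant-velocity dynamics, x <- F x, P <- F P F' + Q
        xp = [x[0] + dt * x[1], x[1]]
        Pp = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
               P[0][1] + dt * P[1][1]],
              [P[1][0] + dt * P[1][1],
               P[1][1] + q]]
        # Update: measurement z observes position only (H = [1, 0])
        S = Pp[0][0] + r                    # innovation variance
        K = [Pp[0][0] / S, Pp[1][0] / S]    # gain: position AND velocity
        y = z - xp[0]                       # innovation (prediction error)
        x = [xp[0] + K[0] * y, xp[1] + K[1] * y]
        P = [[(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
             [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]]]
        estimates.append((x[0], x[1]))
    return estimates
```

Note that the Kalman gain K has a nonzero velocity component even though only position is measured, so each position prediction error also updates the velocity estimate (and the predicted position depends on the current velocity estimate). This is a minimal instance of the bidirectional coupling between position and motion estimates that the framework posits.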