Human behavior in natural tasks consists of an intricately coordinated dance of cognitive, perceptual, and motor activities. While much progress has been made in understanding cognitive, perceptual, and motor processing in isolation or in highly constrained settings, few studies have examined how these systems are coordinated during the execution of complex behavior. Previous research has suggested that during visually guided reaching movements, the eye and hand are yoked, or linked in a nonadaptive manner. In this work we report an experiment that manipulated the demands a task placed on the motor and visual systems, and then examined in detail the resulting changes in visuomotor coordination. We develop an ideal actor model that predicts the optimal coordination of vision and motor control in our task. On the basis of the model’s predictions, we demonstrate that human performance in our experiment reflects an adaptive response to the varying costs imposed by our experimental manipulations. Our results stand in contrast to previous theories that have assumed a fixed control mechanism for coordinating vision and motor control in reaching behavior.