Empirical investigations of ideomotor effect anticipations have mainly focused on action effects in the environment. By contrast, action effects that apply to the agent's body have rarely been put to the test in corresponding experimental paradigms. We present a series of experiments using the response-effect compatibility paradigm, in which we studied the impact of to-be-produced tactile action effects on action selection, initiation, and execution. The results showed a robust and reliable impact when these tactile action effects were rendered task-relevant (Exp. 1), but not when they were task-irrelevant (Exps. 2a and 2b). We further showed that anticipations of tactile action effects follow the same time course as anticipations of environment-related effects (Exps. 3 and 4). These findings demonstrate that body-related action effects affect action control much as environment-related effects do, and therefore support the theoretical assumption of the functional equivalence of all types of action effects.
Spatial target information for movement planning appears to be coded in a gaze-centered reference frame. In touch, however, location is initially coded with reference to the skin. Therefore, the tactile spatial location must be derived by integrating skin location and posture. It has been suggested that this recoding is impaired when the limb is placed in the opposite hemispace, for example, by limb crossing. Here, human participants reached toward visual and tactile targets located at uncrossed and crossed feet in a sensorimotor decision task. We characterized stimulus recoding by analyzing the timing and spatial profile of hand reaches. For tactile targets at crossed feet, skin-based information implicates the incorrect side, and only recoded information points to the correct location. Participants initiated straight reaches and redirected the hand toward a target presented in midflight. Trajectories to visual targets were unaffected by foot crossing. In contrast, trajectories to tactile targets were redirected later with crossed than uncrossed feet. Reaches to crossed feet usually continued straight until they were directed toward the correct tactile target and were not biased toward the skin-based target location. Occasional, far deflections toward the incorrect target were most likely when this target was implicated by trial history. These results are inconsistent with the suggestion that spatial transformations in touch are impaired by limb crossing, but are consistent with tactile location being recoded rapidly and efficiently, followed by integration of skin-based and external information to specify the reach target. This process may be implemented in a bounded integrator framework.
Visual spatial information is paramount in guiding bimanual coordination, but anatomical factors, too, modulate performance in bimanual tasks. Vision conveys not only abstract spatial information, but also informs about body-related aspects such as posture. Here, we asked whether, accordingly, visual information induces body-related, or merely abstract, perceptual-spatial constraints in bimanual movement guidance. Human participants made rhythmic, symmetrical and parallel, bimanual index finger movements with the hands held in the same or different orientations. Performance was more accurate for symmetrical than parallel movements in all postures, but additionally when homologous muscles were concurrently active, such as when parallel movements were performed with differently rather than identically oriented hands. Thus, both perceptual and anatomical constraints were evident. We manipulated visual feedback with a mirror between the hands, replacing the image of the right with that of the left hand and creating the visual impression of bimanual symmetry independent of the right hand's true movement. Symmetrical mirror feedback impaired parallel, but improved symmetrical bimanual performance compared with regular hand view. Critically, these modulations were independent of hand posture and muscle homology. Thus, visual feedback appears to contribute exclusively to spatial, but not to body-related, anatomical movement coding in the guidance of bimanual coordination.

Whether we type on a keyboard, applaud, or ride a bike, bimanual coordination is crucial in many of our everyday activities. Therefore, the principles that guide bimanual coordination have received much interest, not least to inform treatment to restore regular bimanual function in clinical settings. Beyond therapeutic considerations, coordinative action can be viewed as an ecologically valid model to understand the principles of movement planning 1.
Accordingly, experiments have studied the factors that constrain bimanual movement execution. A prominent and consistent finding has been that humans can perform symmetrical movements, with symmetry usually defined relative to the sagittal body midline, with higher precision and at higher speeds than parallel movements [2][3][4]. During symmetrical movements, the two effectors move towards opposite sides of space; for instance, one hand moves to the right while the other concurrently moves to the left. Conversely, parallel movements implicate movements towards the same direction of space; for instance, both hands synchronously move to the left or to the right. The symmetry bias has been demonstrated across a variety of effectors and movement types, such as finger flexion and extension 5,6