Predictable somatosensory feedback leads to a reduction in tactile sensitivity. This phenomenon, called tactile suppression, relies on a mechanism that uses an efference copy of motor commands to help select relevant aspects of incoming sensory signals. We investigated whether tactile suppression is modulated by (a) the task-relevancy of the predicted consequences of movement and (b) the intensity of related somatosensory feedback signals. Participants reached to a target region in the air in front of a screen; visual or tactile feedback indicated that the reach was successful. Furthermore, tactile feedback intensity (strong vs. weak) varied across two groups of participants. We measured tactile suppression by comparing detection thresholds for a probing vibration applied to the finger either early or late during the reach and at rest. As expected, we found an overall decrease in late-reach suppression, as no touch was involved at the end of the reach. We observed an increase in the degree of tactile suppression when strong tactile feedback was given at the end of the reach, compared to when weak tactile feedback or visual feedback was given. Our results suggest that the extent of tactile suppression can be adapted to different demands of somatosensory processing. Downregulation of this mechanism is invoked only when the consequences of missing a weak but task-relevant feedback signal are severe for the task. The decisive factor for the presence of tactile suppression seems not to be the predicted action effect as such, but the need to detect and process anticipated feedback signals occurring during movement.
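The abstract does not specify the fitting procedure, but detection thresholds of the kind described above are commonly estimated by fitting a psychometric function to detection data, with suppression expressed as the threshold elevation during the reach relative to rest. A minimal sketch with simulated data (all amplitudes, trial counts, and parameter values below are hypothetical illustrations, not the study's data):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, mu, sigma):
    # Cumulative Gaussian: probability of detecting a vibration of amplitude x;
    # mu is the 50%-detection threshold, sigma the slope parameter.
    return norm.cdf(x, loc=mu, scale=sigma)

rng = np.random.default_rng(0)
amplitudes = np.linspace(0.1, 2.0, 8)  # probe amplitudes (arbitrary units)
n_trials = 40                          # trials per amplitude

def simulate(true_mu, true_sigma):
    # Simulate proportions of detected probes at each amplitude
    p = norm.cdf(amplitudes, true_mu, true_sigma)
    hits = rng.binomial(n_trials, p)
    return hits / n_trials

def fit_threshold(prop_detected):
    # Fit the psychometric function and return the estimated threshold (mu)
    popt, _ = curve_fit(psychometric, amplitudes, prop_detected,
                        p0=[1.0, 0.5], bounds=([0.0, 0.05], [3.0, 3.0]))
    return popt[0]

rest = fit_threshold(simulate(true_mu=0.6, true_sigma=0.3))
early_reach = fit_threshold(simulate(true_mu=1.0, true_sigma=0.3))
suppression = early_reach - rest  # positive value = reduced sensitivity during reach
print(f"rest: {rest:.2f}, reach: {early_reach:.2f}, suppression: {suppression:.2f}")
```

A higher threshold during the reach than at rest is the signature of tactile suppression; the group comparison in the study would then ask whether this elevation differs between the strong- and weak-feedback groups.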
In everyday life, humans are confronted with changing environmental demands. In order to act successfully and achieve intended goals, action control is required. A recent approach, the Binding and Retrieval in Action Control (BRAC) framework, attempts to provide an overarching perspective on action control. Based on basic principles such as binding and retrieval, findings from several experimental paradigms could be integrated. However, the focus so far has been on rather artificial paradigms involving very simple motor response requirements, such as finger lifting or button presses. We aimed to extend the BRAC framework to more complex movements consisting of a sequence of several discrete actions. Participants were asked to grasp and lift an object with an uneven mass distribution. Object features, such as mass distribution and position, were either kept constant on a global level or varied in a pseudorandomized manner. When both object features were kept constant, participants were able to adjust their grasp so that it resulted in a more stable lift and less object roll. Further, with randomly mixed object features, we found the best task performance when both object features were completely repeated from one trial to the next. These results suggest that tasks with more complex movements are capable of reflecting principles of action control as defined by the BRAC framework. This offers the possibility of testing these principles in even more complex and ecologically relevant paradigms to improve our understanding of everyday actions.
Humans can judge the quality of their perceptual decisions—an ability known as perceptual confidence. Previous work suggested that confidence can be evaluated on an abstract scale that can be sensory modality-independent or even domain-general. However, evidence is still scarce on whether confidence judgments can be directly made across visual and tactile decisions. Here, we investigated in a sample of 56 adults whether visual and tactile confidence share a common scale by measuring visual contrast and vibrotactile discrimination thresholds in a confidence-forced choice paradigm. Confidence judgments were made about the correctness of the perceptual decision between two trials involving either the same or different modalities. To estimate confidence efficiency, we compared discrimination thresholds obtained from all trials to those from trials judged to be relatively more confident. We found evidence for metaperception because higher confidence was associated with better perceptual performance in both modalities. Importantly, participants were able to judge their confidence across modalities without any costs in metaperceptual sensitivity and only minor changes in response times compared to unimodal confidence judgments. In addition, we were able to predict cross-modal confidence well from unimodal judgments. In conclusion, our findings show that perceptual confidence is computed on an abstract scale and that it can assess the quality of our decisions across sensory modalities.
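The confidence-forced choice logic can be illustrated with a toy signal-detection simulation (the distributions and parameters below are assumptions for illustration, not the study's data): confidence is modeled as the distance of the internal evidence from the decision boundary, and pairs of trials are compared, keeping the trial judged more confident. The study compared discrimination thresholds rather than raw accuracy, but the metaperception prediction is the same: the more-confident subset should perform better.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10000  # number of trials (must be even for pairing)

signal = rng.uniform(0.2, 2.0, n)           # stimulus strength on each trial
evidence = signal + rng.normal(0.0, 1.0, n)  # noisy internal response
correct = evidence > 0                       # decision correct when evidence favors the stimulus
confidence = np.abs(evidence)                # confidence = distance from decision boundary

# Confidence-forced choice: pair consecutive trials and keep the one
# the simulated observer judges as relatively more confident.
idx = np.arange(0, n, 2)
chosen = np.where(confidence[idx] >= confidence[idx + 1], idx, idx + 1)

acc_all = correct.mean()
acc_chosen = correct[chosen].mean()
print(f"accuracy, all trials: {acc_all:.3f}; confident-chosen trials: {acc_chosen:.3f}")
```

In this toy model the modality of each trial is abstracted away; the cross-modal question in the study is whether this confidence comparison works just as well when the two trials of a pair come from different senses.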
When interacting with objects, we often rely on visual information. However, vision is not always the most reliable sense for determining relevant object properties. For example, when the mass distribution of an object cannot be inferred visually, humans may rely on predictions about the object's dynamics. Such predictions may not only influence motor behavior but also associated somatosensory processing, as sensorimotor predictions lead to reduced tactile sensitivity during movement. We examined whether predictions based on sensorimotor memories influence grasping kinematics and associated tactile processing. Participants lifted an object of unknown mass distribution and reported whether they detected a tactile stimulus on their grasping hand during the lift. In Experiment 1, the mass distribution could change from trial to trial, whereas in Experiment 2, we intermingled longer and shorter sequences of constant and variable mass distributions, while also providing implicit or explicit information about the trial structure. In both experiments, participants grasped the object by predictively choosing contact points that would compensate for the mass distribution experienced in the previous trial. Tactile suppression during movement, however, was invariant across conditions. These results suggest that predictions based on sensorimotor memories can influence movement kinematics but may not affect associated tactile perception.
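Detection of a tactile stimulus during movement, as in the task above, is often summarized with signal-detection measures. A small sketch of how sensitivity (d') at rest versus during movement could be computed from detection counts (the counts and the log-linear correction are illustrative assumptions, not taken from the study):

```python
from scipy.stats import norm

def dprime(hits, misses, false_alarms, correct_rejections):
    # Signal-detection sensitivity, with a log-linear correction
    # so that extreme hit/false-alarm rates of 0 or 1 stay finite.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts: detecting a tactile probe at rest vs. during the lift
d_rest = dprime(hits=45, misses=5, false_alarms=4, correct_rejections=46)
d_move = dprime(hits=30, misses=20, false_alarms=4, correct_rejections=46)
print(f"d' at rest: {d_rest:.2f}, d' during movement: {d_move:.2f}")
```

Lower d' during movement than at rest indicates tactile suppression; the study's finding is that this drop did not differ across the predictability conditions.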