Sensory perception is a product of interactions between the internal state of an organism and the physical attributes of a stimulus. Across the animal kingdom, perception and sensory-evoked physiological responses have been shown to be modulated depending on whether the stimulus is the consequence of voluntary action. These phenomena are often attributed to motor signals sent to relevant sensory regions that convey information about upcoming sensory consequences. However, the neurophysiological signature of action-locked modulations in sensory cortex, and its relationship with perception, remains unclear. In the current study, we recorded neurophysiological (using magnetoencephalography) and behavioral responses from 16 healthy subjects performing an auditory detection task with faint tones. Tones were either generated by subjects' voluntary button presses or occurred predictably following a visual cue. By introducing a constant temporal delay between the button press/cue and tone delivery, and applying source-level analysis, we decoupled action-locked and auditory-locked activity in auditory cortex. We show action-locked evoked responses in auditory cortex that follow sound-triggering actions and precede sound onset. Such evoked responses were not found for button presses that were not coupled with sounds, or for sounds delivered following a predictive visual cue. Our results provide evidence for efferent signals in human auditory cortex that are locked to voluntary actions coupled with future auditory consequences.
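For illustration only, the sketch below shows how press-locked and tone-locked responses of the kind described above could, in principle, be separated by epoching the same MEG recording around the two event types. This is not the authors' analysis code; the file name, trigger channel, event codes, and the exact delay are assumptions.

```python
# Hypothetical sketch (not the authors' pipeline): decoupling action-locked and
# auditory-locked evoked responses by time-locking epochs of one MEG recording
# to two different event types. File name and trigger codes are assumptions.
import mne

raw = mne.io.read_raw_fif("subject01_task_raw.fif", preload=True)  # assumed file
raw.filter(l_freq=1.0, h_freq=40.0)  # band-pass typical for evoked analyses

events = mne.find_events(raw, stim_channel="STI 101")  # assumed trigger channel
event_id = {"button_press": 1, "tone_onset": 2}        # assumed trigger codes

# With a constant press-to-tone delay (e.g., 500 ms), activity locked to the
# press can be inspected in the window before the tone arrives.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.5,
                    baseline=(-0.2, 0.0), preload=True)

evoked_press = epochs["button_press"].average()  # action-locked response
evoked_tone = epochs["tone_onset"].average()     # auditory-locked response
evoked_press.plot_joint()
```

Source localization of these evoked responses (e.g., to auditory cortex) would be a separate step; the point here is only that the constant delay lets the two time-locked averages be computed independently.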
Voluntary actions are shaped by desired goals and internal intentions. Multiple factors, including the planning of subsequent actions and the expectation of sensory outcomes, have been shown to modulate the kinetics and neural activity patterns associated with similar goal-directed actions. Notably, in many real-world tasks, actions can also vary in the semantic meaning they convey, yet little is known about how semantic meaning modulates the associated neurobehavioral measures. Here, we examined how behavioral and functional magnetic resonance imaging measures are modulated when subjects execute similar actions (button presses) with two different semantic meanings: answering "yes" or "no" to a binary question. Our findings reveal that, when subjects answer with their right hand, the two semantic meanings are differentiated by voxel patterns in the frontoparietal cortex and the lateral occipital complex bilaterally. When subjects used their left hand, similar regions were found, albeit only at a more liberal threshold. Although subjects were faster to answer "yes" than "no" with their right hand, the neural differences cannot be explained by these kinetic differences. To the best of our knowledge, this is the first evidence that semantic meaning is embedded in the neural representation of actions, independent of alternative modulating factors such as kinetic and sensory features.
Accurate control of everyday goal-directed actions is mediated by sensory-motor predictions of intended consequences and their comparison with actual outcomes. Such online comparisons of the expected and reafferent, immediate sensory feedback are conceptualized as internal forward models. Current predictive coding theories describing such models typically address the processing of immediate sensory-motor goals, yet voluntary actions are also oriented toward long-term conceptual goals and intentions, for which the sensory consequence is sometimes absent or cannot be fully predicted. Thus, the neural mechanisms underlying actions with distal conceptual goals are far from clear. Specifically, it is still unknown whether sensory-motor circuits also encode information about the global meaning of the action, detached from the immediate, movement-related goal. Therefore, using fMRI and behavioral measures, we examined identical actions (right- or left-hand button presses) performed with two different semantic intentions ("yes"/"no" responses to questions about visual stimuli). Importantly, the actions were devoid of differences in immediate sensory outcome. Our findings revealed voxel patterns differentiating the two semantic goals in the frontoparietal cortex and in visual pathways including the lateral occipital complex, in both hemispheres. Behavioral measures suggest that these patterns cannot be explained by kinetic differences such as force. To the best of our knowledge, this is the first evidence that semantic meaning is embedded in the neural representation of actions, independent of immediate sensory outcome and kinetic differences.
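As a rough illustration of the voxel-pattern analysis described in the two abstracts above, the sketch below cross-validates a linear classifier on trial-wise region-of-interest patterns for the two semantic meanings. It is not the authors' pipeline; the data here are simulated, and the trial and voxel counts are assumptions.

```python
# Hypothetical sketch (not the authors' pipeline): ROI-based multivoxel pattern
# classification of "yes" vs. "no" trials with a cross-validated linear SVM.
# X and y stand in for real trial-wise activity estimates; here they are simulated.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 200                   # assumed trial/voxel counts
X = rng.standard_normal((n_trials, n_voxels))  # trial-wise voxel patterns (simulated)
y = np.repeat([0, 1], n_trials // 2)           # 0 = "no" trial, 1 = "yes" trial

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)

# Above-chance mean accuracy would indicate that the ROI's voxel pattern
# carries information about the semantic meaning of the action.
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

With the simulated data above, accuracy should hover near chance (0.5); in a real analysis, reliably above-chance decoding in frontoparietal or lateral occipital ROIs would correspond to the differentiation the abstracts report.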