Background: The arbitration between decision-making strategies is shaped by the degree of controllability over environmental events. Under low control, individuals may rely more heavily on Pavlovian bias (PB), which facilitates and inhibits actions when facing appetitive and aversive cues, respectively. More recently, extreme PB was implicated in learned helplessness (LH), which is typically induced by uncontrollable punishment. On the neural level, the medial prefrontal cortex (mPFC) has been pinpointed as a region underlying both cognitive control over PB and the pathogenesis of LH.
Objective/Hypothesis: To test whether high-definition transcranial direct current stimulation (HD-tDCS) targeting the mPFC counteracts the deleterious behavioral effects of low controllability over rewards and losses ("yoking") during reinforcement learning.
Methods: In a pre-registered, between-group, double-blind study (N = 103, healthy adults), we tested the interaction of low controllability and HD-tDCS on performance in a Go/NoGo task. Yoking was implemented by presenting random outcomes following responses, while matching reward/loss frequencies between the control and yoked groups. HD-tDCS was delivered for 15 minutes at 2 mA using a 4x1 montage centered at electrode position Fz.
Results: HD-tDCS improved response accuracy by the end of the task only when applied simultaneously with yoking. The beneficial effects of active stimulation in yoked participants were more pronounced in reward-predictive trials. Finally, computational modeling revealed that parameter estimates of learning rate and choice randomness were modulated by yoking and HD-tDCS in an interactive manner.
Conclusions: These results highlight the potential of our HD-tDCS protocol for interfering with choice arbitration in volatile environments, resulting in more adaptive behavior.
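To make the modeling terms concrete: Go/NoGo reinforcement-learning tasks of this kind are commonly formalized as a Rescorla-Wagner learner with a softmax choice rule, in which a learning rate scales value updates, an inverse-temperature parameter governs choice randomness, and a Pavlovian bias term pushes toward Go for appetitive cues. The sketch below is illustrative only and assumes that standard formulation; the abstract does not give the authors' model equations, and every function name and parameter value here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(w, beta):
    """Choice randomness: higher beta means more deterministic choices."""
    e = np.exp(beta * (w - w.max()))
    return e / e.sum()

def run_agent(outcomes, n_trials=200, n_cues=4, alpha=0.3, beta=5.0, pi=0.5):
    """Simulate a Go/NoGo learner with a Pavlovian bias (illustrative).

    alpha    : learning rate
    beta     : inverse temperature (choice randomness)
    pi       : Pavlovian bias weight (boosts Go for appetitive cues)
    outcomes : callable (cue, action) -> outcome in {-1, 0, 1}
    """
    q = np.zeros((n_cues, 2))   # instrumental values for (NoGo, Go)
    v = np.zeros(n_cues)        # Pavlovian stimulus values
    actions = []
    for _ in range(n_trials):
        cue = rng.integers(n_cues)
        w = q[cue].copy()
        w[1] += pi * v[cue]                        # bias only the Go action
        action = rng.choice(2, p=softmax(w, beta))
        r = outcomes(cue, action)
        q[cue, action] += alpha * (r - q[cue, action])   # Rescorla-Wagner update
        v[cue] += alpha * (r - v[cue])
        actions.append(action)
    return np.array(actions)

# The "yoked" condition described in the Methods: outcomes are random
# regardless of the response, with reward/loss frequencies unaffected by action.
yoked_choices = run_agent(lambda cue, action: rng.choice([-1, 1]))
```

The final line mirrors the yoking manipulation described in the Methods, where outcomes follow responses at random; fitting such a model per participant is one standard way to obtain the learning-rate and choice-randomness estimates mentioned in the Results.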
Cross-modal integration is ubiquitous in perception and, in humans, the McGurk effect demonstrates that seeing a person articulate speech can change what we hear into a new auditory percept. It remains unclear whether cross-modal integration of sight and sound generalizes to other visible vocal articulations, such as those made by singers. We surmised that perceptual integration should extend to music, since its auditory signals contain ample indeterminacy and variability. We show that swapping the videos accompanying sung musical intervals systematically changes the estimated distance between the two notes of an interval: pairing the video of a smaller sung interval with a relatively larger auditory interval compressed the rated interval, whereas the reverse pairing stretched it. In addition, after seeing a visually swapped video of an equally tempered sung interval and then hearing the same interval played on the piano, participants often judged the two intervals to be different, even though they differed only in instrument. These findings reveal spontaneous cross-modal integration of vocal sounds and clearly indicate that strong integration of sound and sight can occur beyond the articulations of natural speech.
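For context, the distance between two notes is standardly quantified in cents, a logarithmic unit in which one equal-tempered semitone spans 100 cents. The abstract does not specify the authors' measure, so the snippet below only illustrates this standard conversion from frequencies to interval size.

```python
import math

def cents(f1_hz: float, f2_hz: float) -> float:
    """Interval size in cents (100 cents = one equal-tempered semitone)."""
    return 1200.0 * math.log2(f2_hz / f1_hz)

# Example: A4 (440 Hz) up to E5 (~659.26 Hz) is a perfect fifth, ~700 cents
print(round(cents(440.0, 659.26)))  # -> 700
```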