Task-switching training has been shown to improve performance not only on the trained tasks (i.e., to reduce the performance costs resulting from task switches) but also on structurally similar tasks (near transfer) and even on dissimilar tasks (far transfer). However, it is still unclear whether this improvement is specific to the trained input modality or whether cognitive control operates at an amodal processing level, enabling transfer of set-shifting abilities across input modalities. In this study, training and transfer were assessed with an auditory task-switching paradigm in which spoken words from different semantic categories were presented dichotically, requiring participants to switch between two auditory categorization tasks. Cross-modal transfer of the task-switching training was assessed in terms of performance costs in a visual task-switching situation using tasks that were structurally similar to the trained ones. The 4-day training significantly reduced the costs of mixing the two auditory tasks, compared to both an active control group (auditory single-task training) and a passive control group (no training). More importantly, the auditory task-switching training also reduced the mixing costs for the untrained visual tasks, indicating cross-modal transfer. This finding suggests that the improvement resulting from task-switching training is not specific to the trained stimulus modality but is driven by a cognitive control mechanism operating at an amodal processing level. The training did not reveal any far-transfer effects to working memory, inhibition, or fluid intelligence, suggesting that the modality-independent enhancement of set-shifting does not generalize to other cognitive control functions.
The omnipresence of food cues in everyday life has been linked to problematic eating behavior and rising rates of obesity. While extensive research has examined the effects of negative emotions and stress on food consumption, very little is known about how positive emotions affect eating, and particularly attention toward food cues. In the present study, we investigated whether humor influences attentional bias toward food and whether it affects preferences for healthy and unhealthy food items, depending on hunger state. Randomly assigned participants watched either funny video clips (humor group, N = 46) or neutral ones (control group, N = 49). Afterwards, they performed a modified Posner cueing task with low- or high-calorie food images serving as cues. We found a significant group × hunger interaction: compared to the control group, the humor group responded more slowly to food cues when hungry, whereas the opposite was true when participants were satiated. Additionally, our results suggest that hunger may direct attention away from healthy food cues and toward unhealthy ones. No group differences were found with respect to food preferences or the engagement and disengagement of attention. We discuss the potential of humor to counteract the aversive consequences of hunger on attention allocation toward food, and we propose an underlying mechanism involving a combined reduction in cortisol levels and decreased activation of the reward system. However, given the novelty of these findings, further research is warranted, both to replicate the results and to investigate the suggested underlying processes.