Even more than in cognitive research applications, moving fMRI to the clinic and the drug development process requires the generation of stable and reliable signal changes. The performance characteristics of the fMRI paradigm constrain experimental power and may require different study designs (e.g., crossover vs. parallel groups), yet fMRI reliability characteristics can depend strongly on the nature of the fMRI task. The present study investigated both within-subject and group-level reliability of a combined three-task fMRI battery targeting three systems of wide applicability in clinical and cognitive neuroscience: an emotional (face matching), a motivational (monetary reward anticipation) and a cognitive (n-back working memory) task. A group of 25 young, healthy volunteers was scanned twice on a 3T MRI scanner with a mean test-retest interval of 14.6 days. fMRI reliability was quantified using the intraclass correlation coefficient (ICC) applied at three different levels ranging from a global to a localized, fine spatial scale: (1) reliability of group-level activation maps over the whole brain and within targeted regions of interest (ROIs); (2) within-subject reliability of ROI-mean amplitudes; and (3) within-subject reliability of individual voxels in the target ROIs. Results showed robust evoked activation for all three tasks in their respective target regions (emotional task=amygdala; motivational task=ventral striatum; cognitive task=right dorsolateral prefrontal cortex and parietal cortices) with high effect sizes (ES) of ROI-mean summary values (ES=1.11-1.44 for the faces task, 0.96-1.43 for the reward task, 0.83-2.58 for the n-back task). Reliability of group-level activation was excellent for all three tasks, with ICCs of 0.89-0.98 at the whole-brain level and 0.66-0.97 within target ROIs.
Within-subject reliability of ROI-mean amplitudes across sessions was fair to good for the reward task (ICCs=0.56-0.62) and, depending on the particular ROI, also fair to good for the n-back task (ICCs=0.44-0.57), but lower for the faces task (ICCs=-0.02 to 0.16). In conclusion, all three tasks are well suited to between-subject designs, including imaging genetics. When specific recommendations are followed, the n-back and reward tasks are also suited to within-subject designs, including pharmaco-fMRI. The present study provides task-specific fMRI reliability performance measures that will inform the optimal use, powering and design of fMRI studies using comparable tasks.
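The abstract reports test-retest reliability as intraclass correlations without naming the variant; a common choice for within-subject test-retest designs of this kind is ICC(3,1) (two-way mixed effects, consistency, single measurement). A minimal sketch of that computation, assuming a subjects × sessions array of ROI-mean amplitudes (the function name and data layout are illustrative, not taken from the study):

```python
import numpy as np

def icc_3_1(data):
    """ICC(3,1): two-way mixed effects, consistency, single measurement.

    data: (n_subjects, k_sessions) array, e.g. ROI-mean amplitudes
    from each scanning session.
    """
    n, k = data.shape
    grand = data.mean()
    subj_means = data.mean(axis=1)
    sess_means = data.mean(axis=0)

    # Partition total variability into subject, session, and residual parts.
    ss_total = ((data - grand) ** 2).sum()
    ss_subj = k * ((subj_means - grand) ** 2).sum()
    ss_sess = n * ((sess_means - grand) ** 2).sum()
    ss_err = ss_total - ss_subj - ss_sess

    ms_subj = ss_subj / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Perfectly consistent retest (session 2 = session 1 + constant offset)
# yields ICC(3,1) = 1, since consistency ICCs ignore additive session effects.
perfect = np.array([[1.0, 1.5], [2.0, 2.5], [3.0, 3.5], [4.0, 4.5]])
print(icc_3_1(perfect))  # → 1.0
```

Because ICC(3,1) discounts a uniform shift between sessions, a global signal drift between scans would not lower it, whereas agreement-type variants such as ICC(2,1) would be penalized; which behavior is wanted depends on the research question.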
Two incompatible pictures compete for perceptual dominance when each is presented to one eye. This so-called binocular rivalry results in an alternation of dominant and suppressed percepts. In accordance with current theories of emotion processing, the authors' previous research has suggested that emotionally arousing pictures predominate in this perceptual process. Three experiments were run with pictures of emotional facial expressions, which are known to induce emotions while being well controlled in terms of physical characteristics. In Experiment 1, photographs of emotional and neutral facial expressions of the same actor were presented to minimize physical differences. In Experiment 2, schematic emotional expressions were presented to further eliminate low-level differences. In Experiment 3, a probe-detection task was conducted to control for possible response biases. Together, these data clearly demonstrate that emotional facial expressions predominate over neutral expressions: they are more often the first percept and are perceived for longer durations. This is not caused by physical stimulus properties or by response biases. This novel approach supports the view that emotionally significant visual stimuli are preferentially perceived.
Our first impression of others is strongly influenced by their facial appearance. However, the perception and evaluation of faces is guided not only by internal features such as facial expressions, but is also highly dependent on contextual information, such as secondhand information (verbal descriptions) about the target person. To investigate the time course of contextual influences on cortical face processing, event-related brain potentials (ERPs) were recorded in response to neutral faces preceded by brief verbal descriptions containing cues of affective valence (negative, neutral, positive) and self-reference (self-related vs. other-related). ERP analysis demonstrated that early and late stages of face processing are enhanced by negative, positive, and self-relevant descriptions, although the faces themselves did not differ perceptually. Affective ratings of the faces confirmed these findings. Altogether, these results demonstrate for the first time, at both the electrocortical and behavioral level, how contextual information modifies early visual perception in a top-down manner.
Numerous studies have shown that humans automatically react with congruent facial reactions, i.e., facial mimicry, when seeing a counterpart's facial expressions. The current experiment is the first to investigate the neuronal structures responsible for differences in the occurrence of such facial mimicry reactions by simultaneously measuring BOLD and facial EMG in an MRI scanner. To this end, 20 female students viewed emotional facial expressions (happy, sad, and angry) of male and female avatar characters. During picture presentation, the BOLD signal as well as M. zygomaticus major and M. corrugator supercilii activity were recorded simultaneously. After correction for MR-related artifacts, results show prototypical patterns of facial mimicry: enhanced M. zygomaticus major activity in response to happy expressions and enhanced M. corrugator supercilii activity in response to sad and angry expressions. Regression analyses show that these congruent facial reactions correlate significantly with activations in the IFG, SMA, and cerebellum. Stronger zygomaticus reactions to happy faces were further associated with increased activity in the caudate, MTG, and PCC. Corrugator reactions to angry expressions were further correlated with activity in the hippocampus, insula, and STS. Results are discussed in relation to core and extended models of the mirror neuron system (MNS).
Visual emotional stimuli evoke enhanced activation in early visual cortex areas, which may help organisms quickly detect biologically salient cues and initiate appropriate approach or avoidance behavior. Functional neuroimaging evidence for the modulation of other sensory modalities by emotion is scarce. The aim of the present study was therefore to test whether sensory facilitation by emotional cues can also be found in the auditory domain. We recorded auditory brain activation with functional near-infrared spectroscopy (fNIRS), a non-invasive and silent neuroimaging technique, while participants listened to standardized pleasant, unpleasant, and neutral sounds selected from the International Affective Digitized Sounds (IADS) system. Pleasant and unpleasant sounds led to increased auditory cortex activation compared with neutral sounds. This is the first study to suggest that the enhanced activation of sensory areas in response to complex emotional stimuli is apparently not restricted to the visual domain but is also evident in the auditory domain.