The error(-related) negativity (Ne or ERN) has been related to detecting the mismatch between incorrectly executed and appropriate responses or, alternatively, to the degree of conflict between different response alternatives. In this study, different levels of response conflict were generated by manipulating task difficulty in a Simon task. As indexed by the product of incorrect and subsequent correct EMG activation, the amount of conflict in error trials was indeed larger in the easy than in the hard condition. In contrast, Ne/ERN amplitudes did not differ between difficulty conditions, nor was the amount of conflict mirrored by Ne/ERN amplitude. Therefore, the present data are at variance with the hypothesis that the Ne/ERN reflects the degree of response conflict.
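For illustration, a minimal sketch of the conflict index described above: the product of the incorrect and the subsequent correct EMG activation on an error trial. The function name, the rectification-plus-integration step, and the simulated data are assumptions for the example; the abstract does not specify the integration window or normalization the authors used.

```python
import numpy as np

def emg_conflict_index(incorrect_emg, correct_emg):
    """Conflict index for one error trial: product of the integrated
    (rectified) EMG activation of the incorrect response and of the
    subsequent correct response. Names and details are illustrative."""
    incorrect_activation = np.trapz(np.abs(incorrect_emg))
    correct_activation = np.trapz(np.abs(correct_emg))
    return float(incorrect_activation * correct_activation)

# Hypothetical usage with simulated single-trial EMG bursts.
rng = np.random.default_rng(0)
incorrect_burst = 0.5 * np.abs(rng.standard_normal(200))  # partial incorrect activation
correct_burst = np.abs(rng.standard_normal(200))          # subsequent correct activation
print(emg_conflict_index(incorrect_burst, correct_burst))
```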
Overt articulation produces strong artifacts in the electroencephalogram and in event-related potentials (ERPs), posing a serious problem for investigating language production with these measures. Here we describe the properties of articulation-related artifacts and propose a novel correction procedure. In Experiment 1, ERPs and the trajectories of the articulators were co-recorded with an electromagnetic articulograph from a single participant. Experiment 2 investigated whether the findings from the single participant generalize to standard picture naming. Both experiments provided evidence that articulation-induced artifacts may start up to 300 ms or more prior to voice onset or voice key onset, depending on the specific measure; they are highly similar in topography across many different phoneme patterns and differ mainly in their time course and amplitude. ERPs were separated from articulation-related artifacts with residue iteration decomposition (RIDE). After obtaining the artifact-free ERPs, their correlations with the articulatory trajectories dropped to near zero. Artifact removal with independent component analysis was less successful: correlations with the articulatory movements remained substantial, and early components prior to voice onset were attenuated in the reconstructed ERPs. These findings offer new insights into the nature of articulation artifacts; together with RIDE as a method for artifact removal, the present report offers a fresh perspective for ERP studies requiring overt articulation.
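The quality check described above (correlating corrected ERPs with the articulograph trajectories) can be sketched as a plain Pearson correlation between time-aligned epochs. This is a minimal sketch under assumed data, not the authors' pipeline: the RIDE decomposition itself is not reproduced, and the function name and placeholder arrays are hypothetical.

```python
import numpy as np

def erp_articulation_correlation(erp_epoch, trajectory_epoch):
    """Pearson correlation between one ERP channel and one articulatory
    trajectory, both sampled at the same rate over the same epoch."""
    erp = erp_epoch - erp_epoch.mean()
    traj = trajectory_epoch - trajectory_epoch.mean()
    denom = np.sqrt((erp ** 2).sum() * (traj ** 2).sum())
    return float((erp * traj).sum() / denom)

# Hypothetical usage: compare correlations before and after artifact correction.
rng = np.random.default_rng(1)
raw_epoch = rng.standard_normal(500)        # uncorrected ERP epoch (placeholder)
corrected_epoch = rng.standard_normal(500)  # artifact-corrected epoch (placeholder)
lip_trajectory = rng.standard_normal(500)   # articulograph channel (placeholder)

print(erp_articulation_correlation(raw_epoch, lip_trajectory))
print(erp_articulation_correlation(corrected_epoch, lip_trajectory))
```

A correction method that works should drive the second correlation toward zero while leaving genuine pre-articulatory ERP components intact.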
The antisaccade task has proven highly useful in basic and clinical neuroscience, and the neural structures involved are well documented. However, the cognitive and neural mechanisms that mediate task performance are not yet understood. An event-related fMRI study was designed to dissociate the neural correlates of two putative key functions, volitional saccade generation and inhibition of reflexive saccades, and to investigate their interaction. Nineteen healthy volunteers performed a task that required them (a) to initiate saccades volitionally, either with or without a simultaneous demand to inhibit a reflexive saccade; and (b) to inhibit a reflexive saccade, either with or without a simultaneous demand to initiate a saccade volitionally. Analysis of blood oxygen level-dependent signal changes confirmed a major role of the frontal eye fields and the supplementary eye fields in volitional saccade generation. Inhibition-related activation was found in a specific fronto-parietal network that is highly consistent with previous evidence on inhibitory processes. Unexpectedly, there was little evidence of specific brain activation during combined generation and inhibition demands, suggesting that the neural processing of generation and inhibition in antisaccades is largely independent.
Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion of how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures for these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform decisions about scoring and data treatment for other emotion expression questions and under different experimental circumstances. Overall, we found that applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are the recommended data treatments. When scoring facial emotion expression ability, the maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
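As a rough illustration of the recommended treatment and scoring, the sketch below applies loess smoothing to a software-coded expression intensity series, subtracts a simple baseline, and takes the maximum as the score. The function and parameter names, the mean-baseline subtraction, and the simulated data are assumptions; the paper's regression-based baseline and facial plasticity controls are not reproduced here.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def max_expression_score(intensity, time, baseline_mask, frac=0.2):
    """Smooth frame-wise expression codes with loess, subtract the mean
    intensity of the baseline (neutral) frames, and return the maximum.

    intensity     : 1-D array of software-coded expression intensities
    time          : 1-D array of frame times in seconds
    baseline_mask : boolean array marking baseline frames
    """
    smoothed = lowess(intensity, time, frac=frac, return_sorted=False)
    baseline = smoothed[baseline_mask].mean()  # crude baseline control
    return float(np.max(smoothed - baseline))

# Hypothetical usage on simulated data: a noisy expression peak around 6 s.
t = np.linspace(0, 10, 300)
codes = 0.1 + 0.8 * np.exp(-(t - 6) ** 2) + 0.05 * np.random.default_rng(2).standard_normal(t.size)
print(round(max_expression_score(codes, t, baseline_mask=(t < 2)), 3))
```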