2017
DOI: 10.1038/s41598-017-13386-0

Music induced happy mood suppresses the neural responses to other’s pain: Evidences from an ERP study

Abstract: In the current study, we explored the time course of processing other's pain under induced happy or sad moods. Event-related potentials (ERPs) were recorded while participants observed pictures showing others in painful or non-painful situations. Mood induction procedures were applied to the participants before the picture observation task. Happy and sad moods were induced by listening to about 10 minutes of music excerpts selected from the Chinese Affective Music System (CAMS). The ERP results revealed that t…

Cited by 26 publications (34 citation statements)
References 76 publications

“…According to the grand-averaged ERP pictures, the topographical distribution and the literature [9, 30, 52, 53], five regions of interest from the frontal (the mean amplitudes of F3, Fz and F4), frontal-central (the mean amplitudes of FC3, FCz and FC4), central (the mean amplitudes of C3, Cz and C4), central-parietal (the mean amplitudes of CP3, CPz and CP4) and parietal (the mean amplitudes of P3, Pz and P4) regions were chosen. Analyses were conducted using the peak amplitudes of N1 (100~180 ms), P2 (180~240 ms), N2 (240~300 ms) and P3 (300~430 ms) and the mean amplitude of the LPC (430~650 ms) after the onset of the painful or nonpainful pictures [9, 16, 28]. A three-way, repeated-measures ANOVA with priming type (subliminal neutral, sad, fear eye region information), target type (painful, nonpainful pictures) and regions of interest was performed.…”
Section: Methods (mentioning)
confidence: 99%
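
The ROI and time-window analysis quoted above can be illustrated with a short sketch. The array name, sampling rate, and epoch span below are assumptions for illustration, not details taken from the cited study; the idea is simply to average each electrode cluster, then take the signed peak (minimum for N1/N2, maximum for P2/P3) or the mean (LPC) inside each measurement window.

```python
import numpy as np

# Hypothetical evoked data: channels x time points, sampled at 500 Hz,
# epoch running from -200 ms to +800 ms relative to picture onset.
SFREQ = 500
TIMES = np.arange(-0.2, 0.8, 1 / SFREQ)              # seconds
CHANNELS = ["F3", "Fz", "F4", "FC3", "FCz", "FC4",
            "C3", "Cz", "C4", "CP3", "CPz", "CP4",
            "P3", "Pz", "P4"]
evoked = np.random.randn(len(CHANNELS), TIMES.size)  # placeholder data

# The five regions of interest named in the quoted passage.
ROIS = {
    "frontal":          ["F3", "Fz", "F4"],
    "frontal-central":  ["FC3", "FCz", "FC4"],
    "central":          ["C3", "Cz", "C4"],
    "central-parietal": ["CP3", "CPz", "CP4"],
    "parietal":         ["P3", "Pz", "P4"],
}

# Component windows (seconds) and how each is scored.
COMPONENTS = {
    "N1":  (0.100, 0.180, "neg_peak"),
    "P2":  (0.180, 0.240, "pos_peak"),
    "N2":  (0.240, 0.300, "neg_peak"),
    "P3":  (0.300, 0.430, "pos_peak"),
    "LPC": (0.430, 0.650, "mean"),
}

def roi_waveform(data, roi_channels):
    """Average the channels belonging to one region of interest."""
    idx = [CHANNELS.index(ch) for ch in roi_channels]
    return data[idx].mean(axis=0)

def score(waveform, tmin, tmax, how):
    """Peak or mean amplitude inside one measurement window."""
    mask = (TIMES >= tmin) & (TIMES <= tmax)
    segment = waveform[mask]
    if how == "neg_peak":
        return segment.min()
    if how == "pos_peak":
        return segment.max()
    return segment.mean()

# One amplitude per ROI x component; values like these would then feed the
# repeated-measures ANOVA described in the quoted passage.
amplitudes = {
    (roi, comp): score(roi_waveform(evoked, chs), tmin, tmax, how)
    for roi, chs in ROIS.items()
    for comp, (tmin, tmax, how) in COMPONENTS.items()
}
```
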
“…As we know, N1 and N2 components reflect early, automatic, affective-sharing processes; people automatically share fearful emotion information rather than sad emotion information when they show empathy for pain [9]. Hence, our hypothesis was that the empathy-for-pain task with fearful emotion would elicit larger N1 and N2 components than the task with sad emotion.…”
Section: Introduction (mentioning)
confidence: 95%
“…This result was in line with several empirical findings that found a pain effect in the late LPC (Cui et al., 2016; Meng et al., 2013; Weng, 2010). The late LPC pain effect indicated a person’s further assessment for painful pictures because the late LPC component was generally associated with a top-down cognitive assessment processing of visual stimuli (Cheng, Jiao, Luo, & Cui, 2017; Wang et al., 2014). We failed to observe the pain effects in the EPN, P300, and early LPC component, which was inconsistent with our hypotheses.…”
Section: Discussion (mentioning)
confidence: 99%
“…Following other published work (Cheng, Jiao, Luo, & Cui, 2017; Meng et al., 2012; Mu & Han, 2013), electrodes were clustered into five sets: frontal (F3, Fz, F4), frontal-central (FC3, FCz, FC4), central (C3, Cz, C4), central-parietal (CP3, CPz, CP4) and parietal (P3, Pz, P4) regions. Amplitude measurement windows were identified using the "collapsed localizer" approach (Luck & Gaspelin, 2017).…”
Section: Differences In Event-related Potentials During Word Cue Pr (mentioning)
confidence: 99%
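
The "collapsed localizer" step mentioned in the last statement (Luck & Gaspelin, 2017) can be sketched in the same spirit: condition averages are collapsed into a single waveform, the measurement window is centred on the peak of that collapsed waveform, and only then are the individual conditions measured inside it, so window selection cannot favour either condition. The array names, sampling rate, search range, and window half-width below are illustrative assumptions, not the cited authors' code.

```python
import numpy as np

SFREQ = 500
TIMES = np.arange(-0.2, 0.8, 1 / SFREQ)  # seconds

# Hypothetical ROI-averaged waveforms for two conditions
# (e.g. painful vs. non-painful pictures), one row per condition.
condition_waves = np.random.randn(2, TIMES.size)

def collapsed_localizer_window(waves, search_tmin, search_tmax, half_width):
    """Centre a measurement window on the peak of the condition-collapsed
    waveform, so the window is chosen blind to condition differences."""
    collapsed = waves.mean(axis=0)
    mask = (TIMES >= search_tmin) & (TIMES <= search_tmax)
    peak_time = TIMES[mask][np.argmax(np.abs(collapsed[mask]))]
    return peak_time - half_width, peak_time + half_width

def mean_amplitude(wave, tmin, tmax):
    """Mean amplitude of one waveform inside the shared window."""
    mask = (TIMES >= tmin) & (TIMES <= tmax)
    return wave[mask].mean()

# Example: locate a window within a broad 300-600 ms search range,
# +/- 50 ms around the collapsed peak, then measure each condition in it.
tmin, tmax = collapsed_localizer_window(condition_waves, 0.300, 0.600, 0.050)
per_condition = [mean_amplitude(w, tmin, tmax) for w in condition_waves]
```
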