2019
DOI: 10.1002/npr2.12059

Characteristics of facial muscle activity during voluntary facial expressions: Imaging analysis of facial expressions based on myogenic potential data

Abstract: Purpose: Facial expressions are formed by the coordination of facial muscles and reflect changes in emotion. Nurses observe facial expressions as a way of understanding patients. This study conducted basic research using facial myogenic potential topography to visualize changes in the location and strength of facial muscle activity associated with voluntary facial expressions and to examine their relationship with those expressions. Methods: Participants comprised 18 healthy adults (6 men, 12 women; mean age, 24.3…

Cited by 18 publications (19 citation statements)
References 20 publications
“…Overall, both sEMG schemes allowed recording of characteristic and discriminable facial muscle activation patterns during different facial movement tasks. The geometric sEMG recording from the entire face introduced by Kuramoto et al. seemed to allow more specific detection of facial muscle activity patterns during facial movement tasks (Kuramoto et al., 2019). As shown by others, there was no relevant side difference (Schumann et al., 2010, 2021; Kuramoto et al., 2019; Cui et al., 2020).…”
Section: Discussion (mentioning; confidence: 99%)
“…The sEMG recording was performed with a multichannel EMG system (gain: 100, frequency range: 10–1,861 Hz; sampling rate: 4,096/s; resolution: 5.96 nV/bit; DeMeTec, Langgöns, Germany). Electromyograms were recorded from both sides of the face simultaneously using both the arrangement of electrodes by Fridlund and Cacioppo (1986) and by Kuramoto et al. (2019) (cf. Figures 1C–G).…”
Section: Methods (mentioning; confidence: 99%)
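
As a rough illustration of the acquisition parameters quoted above (4,096 samples/s, 10–1,861 Hz bandwidth), the following Python sketch shows a generic sEMG preprocessing step: band-pass filtering followed by a moving-RMS activation envelope per channel. It is not the authors' pipeline; the filter order and window length are assumptions chosen only for illustration.

# Illustrative sketch only: the citing study used a DeMeTec multichannel EMG
# system; this is a generic reconstruction of typical sEMG preprocessing at
# the quoted sampling rate, not the authors' code.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 4096              # sampling rate quoted above (samples/s)
BAND = (10.0, 1861.0)  # recording bandwidth quoted above (Hz)

def semg_envelope(raw, fs=FS, band=BAND, win_ms=100):
    """Band-pass the raw sEMG and return a moving-RMS envelope per channel.

    raw: array of shape (n_channels, n_samples), e.g. in volts.
    """
    # 4th-order Butterworth band-pass, applied forward and backward
    # (zero phase) so the envelope is not shifted in time.  The upper band
    # edge stays below the Nyquist frequency (fs / 2 = 2,048 Hz).
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, raw, axis=-1)

    # Moving RMS over a ~100 ms window as a simple activation envelope
    # (assumed window length, not taken from the source).
    win = int(fs * win_ms / 1000)
    kernel = np.ones(win) / win
    power = np.apply_along_axis(
        lambda x: np.convolve(x, kernel, mode="same"), -1, filtered ** 2
    )
    return np.sqrt(power)

# Example: 24 channels, 2 s of random noise standing in for real data.
envelope = semg_envelope(np.random.randn(24, 2 * FS) * 1e-5)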
“…A number of studies have explored the physiological basis of how pain signaling leads to pain-indicative muscle movement. Kuramoto et al. recently used facial myogenic potential topography in 18 healthy adult participants to investigate the facial myogenic potential and subsequent facial expressions (25). Furthermore, Kunz used functional MRI (fMRI) to address the association between brain responses in areas that processed the sensory dimension of pain and activation of the orbicularis oculi muscle (26).…”
Section: Discussion (mentioning; confidence: 99%)
“…Furthermore, the analysis was performed only in men. Although facial sEMG activation patterns were generated for men and women in some studies [6,8,17], gender differences have not yet been analyzed. Therefore, we cannot rule out that women show other facial muscle expression patterns.…”
Section: PLOS ONE (mentioning; confidence: 99%)
“…Fridlund and Cacioppo therefore recommended sEMG recordings from ten facial muscles for psychophysiological experiments [7]. Recently, Kuramoto et al. recommended the use of 24 sEMG electrodes in an EEG-like arrangement over the face, independent of the underlying muscles, together with a myogenic potential topogram analysis [8]. Experimental high-density sEMG (HD sEMG) with 90 electrodes even allows a description of facial muscle activation with activation maps projected onto the facial surface [6].…”
Section: Introduction (mentioning; confidence: 99%)
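
To make the topogram idea mentioned above concrete, the sketch below interpolates 24 per-channel activation values onto a 2D grid, in the spirit of a myogenic potential topogram. The electrode coordinates and activation values are placeholders (randomly generated); the actual facial electrode layout of Kuramoto et al. is not reproduced here.

# Hedged sketch: placeholder electrode positions and activations, used only
# to illustrate how a topogram-style map can be interpolated from 24 scalar
# per-channel values; not the authors' layout or data.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Hypothetical 2D electrode coordinates (x, y) in arbitrary face units.
electrode_xy = rng.uniform(-1, 1, size=(24, 2))

# Per-channel activation, e.g. the mean RMS envelope during one expression task.
activation = rng.uniform(0, 1, size=24)

# Interpolate the 24 scalar values onto a regular grid covering the face area.
grid_x, grid_y = np.mgrid[-1:1:200j, -1:1:200j]
topogram = griddata(electrode_xy, activation, (grid_x, grid_y), method="cubic")

# `topogram` can then be rendered (e.g. with matplotlib's contourf) to show
# where on the face the activity is strongest, analogous to an EEG scalp map.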