2022
DOI: 10.1016/j.eswa.2021.116101

Classification of Individual’s discrete emotions reflected in facial microexpressions using electroencephalogram and facial electromyogram

Cited by 20 publications (8 citation statements)
References 23 publications
“…However, the EMG signals generated by activating specific forearm muscles can be conducted to the surroundings. [ 22 ] The EMG signals recorded by the two devices show similar patterns, as shown in Figure S7 (Supporting Information). Therefore, we confirm that these devices sat near the same muscle groups on the forearm.…”
Section: Results (mentioning)
Confidence: 82%
“…Then, the signals are further processed using a fourth‐order Butterworth bandpass filter with 20 and 450 Hz cutoff frequencies. [ 22,24 ] The filtered EMG signals are divided into short segments using a 300‐ms sliding window (600 samples) with 96% overlap. For each segment, a spatial covariance matrix (SCM) was extracted.…”
Section: Results (mentioning)
Confidence: 99%
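
The processing steps quoted above (a fourth-order Butterworth band-pass at 20–450 Hz, 300-ms windows with 96% overlap, and one spatial covariance matrix per window) can be sketched in Python as follows. This is a minimal illustration, not the cited authors' code: the 2 kHz sampling rate is inferred from 600 samples per 300-ms window, and the channel count, function names, and zero-phase filtering choice are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_emg(raw, fs=2000.0):
    """Fourth-order Butterworth band-pass (20-450 Hz) applied to
    multi-channel EMG shaped (channels, samples)."""
    b, a = butter(N=4, Wn=[20.0, 450.0], btype="bandpass", fs=fs)
    # filtfilt gives zero-phase filtering; an assumption, since the excerpt
    # does not state whether filtering was causal or zero-phase.
    return filtfilt(b, a, raw, axis=-1)

def sliding_scm(emg, win=600, overlap=0.96):
    """Cut filtered EMG into overlapping windows (300 ms = 600 samples,
    96% overlap -> 24-sample step) and return one spatial covariance
    matrix (channels x channels) per window."""
    step = max(1, int(round(win * (1.0 - overlap))))
    n_ch, n_samp = emg.shape
    covs = []
    for start in range(0, n_samp - win + 1, step):
        seg = emg[:, start:start + win]
        seg = seg - seg.mean(axis=1, keepdims=True)  # remove per-channel mean
        covs.append(seg @ seg.T / (win - 1))         # sample spatial covariance
    return np.stack(covs)

# Synthetic usage: 8 hypothetical EMG channels, 5 s at 2 kHz.
rng = np.random.default_rng(0)
raw = rng.standard_normal((8, 5 * 2000))
scms = sliding_scm(bandpass_emg(raw))
print(scms.shape)  # (n_windows, 8, 8)
```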
“…Most of the works using EMG for the recognition of emotional reactions focus on the analysis of facial expressions. For example, Kim et al [ 64 ] explored the use of facial EMG and EEG signals for the classification of the emotions of happiness, surprise, fear, anger, sadness, and disgust. Mithbavkar et al [ 65 ] developed a dataset for emotion recognition based on data collected through electromyograms using dance to stimulate emotional responses such as astonishment, awe, humor, and tranquility.…”
Section: Emotion Recognition (mentioning)
Confidence: 99%
“…Extracting features from fEMG data entails identifying distinct elements within the signal that reflect different emotional states. Several conventional methods for feature extraction in fEMG analysis encompass time-domain [5][6][7][8][9][10], frequency-domain [11], and time-frequency domain [12,13]. Computing time, frequency, and time-frequency features can provide valuable insights and information about the characteristics of signals, data, and phenomena, allowing for better understanding, analysis, and decision-making [14].…”
Section: Introduction (mentioning)
Confidence: 99%
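
As a concrete illustration of the time- and frequency-domain feature extraction referred to above, the sketch below computes a handful of descriptors commonly used for fEMG (mean absolute value, RMS, waveform length, zero crossings, mean and median frequency). The sampling rate and the exact feature set are assumptions chosen for illustration; the cited works name the feature domains, not this specific implementation.

```python
import numpy as np
from scipy.signal import welch

def femg_features(segment, fs=1000.0):
    """A few common time- and frequency-domain features for one
    single-channel fEMG segment (1-D array)."""
    x = np.asarray(segment, dtype=float)
    x = x - x.mean()

    # Time-domain descriptors.
    mav = np.mean(np.abs(x))                 # mean absolute value
    rms = np.sqrt(np.mean(x ** 2))           # root mean square
    wl = np.sum(np.abs(np.diff(x)))          # waveform length
    zc = int(np.sum(x[:-1] * x[1:] < 0))     # zero-crossing count

    # Frequency-domain descriptors from the Welch power spectrum.
    freqs, psd = welch(x, fs=fs, nperseg=min(256, x.size))
    mnf = np.sum(freqs * psd) / np.sum(psd)           # mean frequency
    cdf = np.cumsum(psd)
    mdf = freqs[np.searchsorted(cdf, cdf[-1] / 2.0)]  # median frequency

    return {"MAV": mav, "RMS": rms, "WL": wl, "ZC": zc, "MNF": mnf, "MDF": mdf}

# Synthetic usage: one second of fake fEMG sampled at 1 kHz.
rng = np.random.default_rng(1)
print(femg_features(rng.standard_normal(1000)))
```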