2016
DOI: 10.1007/s00500-016-2395-4
Edge-enhanced bi-dimensional empirical mode decomposition-based emotion recognition using fusion of feature set

Cited by 9 publications (2 citation statements)
References 35 publications
“…For this purpose, we can use so-called "multi-point masks". Individual points on such a mask represent the extracted regions from which the classification of the emotional state is performed [7]. Because of the number of individual regions on the mask that can be scanned simultaneously, and the volume of output data thus produced, various methods have been developed for detecting, extracting and subsequently classifying the emotional state, either in real time or from stored face images.…”
Section: Introduction
confidence: 99%
“…After it was proposed, the empirical mode decomposition (EMD) method was introduced into image fusion, much like the preceding conventional transform methods, including medical image fusion [27]–[30]. EMD decomposes a source image into a residue, which gives the approximate (coarse) representation, and a set of intrinsic mode functions (IMFs), which describe the details of the image.…”
confidence: 99%
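The decomposition the excerpt describes can be illustrated with a minimal 1D EMD sketch (the paper's bi-dimensional variant applies the same sifting idea over 2D envelopes, which is not shown here). The function names `sift` and `emd`, the fixed sifting-iteration count, and the spline-envelope endpoint handling are all simplifying assumptions of this sketch, not the authors' implementation:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, t, n_iters=10):
    """One sifting pass: repeatedly subtract the mean envelope to extract an IMF candidate."""
    h = x.copy()
    for _ in range(n_iters):
        # locate interior local extrema (indices 1..n-2)
        maxima = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        minima = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(maxima) < 2 or len(minima) < 2:
            return None  # too few extrema: h is a residue/trend, not an IMF
        # cubic-spline envelopes through the extrema, endpoints pinned to the signal
        upper = CubicSpline(np.r_[t[0], t[maxima], t[-1]],
                            np.r_[h[0], h[maxima], h[-1]])(t)
        lower = CubicSpline(np.r_[t[0], t[minima], t[-1]],
                            np.r_[h[0], h[minima], h[-1]])(t)
        h = h - (upper + lower) / 2.0  # remove the local mean envelope
    return h

def emd(x, t, max_imfs=5):
    """Decompose x into IMFs plus a residue; x == sum(IMFs) + residue by construction."""
    imfs, residue = [], x.copy()
    for _ in range(max_imfs):
        imf = sift(residue, t)
        if imf is None:
            break  # sifting stalled: what remains is the residue
        imfs.append(imf)
        residue = residue - imf
    return imfs, residue
```

Note that perfect reconstruction (signal = sum of IMFs + residue) holds by construction, since each IMF is subtracted from the running residue; the fast oscillations end up in the early IMFs and the slow trend in the residue, which is the property image-fusion methods exploit.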