2017
DOI: 10.1504/ijbet.2017.082224

Emotional eye movement analysis using electrooculography signal

Abstract: In this study, for recognition of (positive, neutral and negative) emotions using EOG signals, subjects were stimulated with audio-visual stimuli to elicit emotions. Hjorth parameters and the Discrete Wavelet Transform (DWT) (Haar mother wavelet) were employed as feature extractors. Support Vector Machine (SVM) and Naïve Bayes (NB) classifiers were used for classifying the emotions. The results of multiclass classification in terms of classification accuracy show the best performance with the combination DWT+SVM and Hjo…
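
The abstract outlines a pipeline of DWT (Haar) feature extraction from segmented EOG signals followed by SVM or Naïve Bayes classification. Below is a minimal sketch of that kind of pipeline, assuming PyWavelets and scikit-learn; the window length, decomposition level, sub-band energy features and synthetic data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
import pywt
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC


def dwt_energy_features(window, wavelet="haar", level=4):
    """Energy of each Haar-DWT sub-band of one EOG window.

    The level and the use of sub-band energies are assumed, commonly used
    choices; the paper's exact feature set is not reproduced here.
    """
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])


# Synthetic stand-in for segmented EOG windows (90 windows x 512 samples)
# with labels 0/1/2 for negative/neutral/positive emotion.
rng = np.random.default_rng(0)
eog_windows = rng.standard_normal((90, 512))
labels = rng.integers(0, 3, size=90)

X = np.vstack([dwt_energy_features(w) for w in eog_windows])

svm_clf = SVC(kernel="rbf").fit(X, labels)  # DWT+SVM: best-performing combination per the abstract
nb_clf = GaussianNB().fit(X, labels)        # Naïve Bayes comparison classifier
```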

Cited by 15 publications (9 citation statements)
References 12 publications

“…The least successful approaches utilized only pupil diameter, achieving highly similar and low accuracies of 58.9% [42] and 59.0% [36], respectively. The most commonly used feature (eight studies) was pupil diameter [31,36,42,57,86,94,96,101], followed by fixation duration, employed in four studies [31,73,84,101]; the least used features were pupil position [57,94] and EOG [53,54], each used in only two studies. The speed of the emotion recognition task was reported in only one of the studies, which could provide classification results within 2 s (with 10% variation) of the presentation of the emotional stimuli [94].…”
Section: Discussion · mentioning
confidence: 99%
“…In Paul et al. [54], the authors used an audio-visual stimulus to recognize emotion from EOG signals with the Hjorth parameters and a time-frequency feature extraction method, the Discrete Wavelet Transform (DWT) [55]. They used two classifiers in their study, SVM and Naïve Bayes (NB), with the Hjorth features [56].…”
Section: Electrooculography (EOG) · mentioning
confidence: 99%
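
The Hjorth parameters mentioned in the statement above are standard time-domain descriptors (Activity, Mobility, Complexity) that can be computed per EOG window. The sketch below shows the textbook definitions only; it is not code from the cited work.

```python
import numpy as np


def hjorth_parameters(x):
    """Hjorth descriptors of a 1-D signal x:
    Activity   = var(x)
    Mobility   = sqrt(var(x') / var(x))
    Complexity = Mobility(x') / Mobility(x)
    """
    d1 = np.diff(x)        # first difference approximates x'
    d2 = np.diff(d1)       # second difference approximates x''
    activity = np.var(x)
    mobility = np.sqrt(np.var(d1) / activity)
    complexity = np.sqrt(np.var(d2) / np.var(d1)) / mobility
    return activity, mobility, complexity
```
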
“…The scientific research presented in Table 11 embraces EOG technology for emotion recognition and the evaluation of emotion intensity. The majority of techniques separate positive and negative emotion levels (see [164][165][166]). The evaluation of emotion intensity level remains uncertain in many cases and is not comprehensively described.…”
Section: Electrooculography (EOG) · mentioning
confidence: 99%
“…However, studies that focus specifically on emotion classification using eye-tracking data alone are very limited, as most such studies incorporate other sensor modalities such as EEG and ECG. There is a study that focuses on emotional eye movement analysis using electrooculography (EOG) signals [20]. There have also been studies that rely on other types of eye-tracking data, such as fixation duration and pupil position [1,23].…”
Section: Eye-Tracking in Emotion Classification · mentioning
confidence: 99%