2019
DOI: 10.3389/fnbot.2019.00046

Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements

Abstract: Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method using an integration of electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on computation of normalized arousal and valence scores…
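The stimulus selection step in the abstract rests on normalized arousal and valence scores. Below is a minimal sketch of that kind of normalization, assuming 1-9 SAM-style self-report ratings and a min-max scheme; the rating scale, array shapes, and selection thresholds are illustrative assumptions, not details from the paper.

```python
import numpy as np

def normalized_scores(ratings):
    """Min-max normalize per-video ratings to [0, 1].

    `ratings` is an (n_videos, n_raters) array of 1-9 SAM-style
    self-report scores (an assumed rating scale, not from the paper).
    """
    mean = ratings.mean(axis=1)  # average each video's score across raters
    return (mean - mean.min()) / (mean.max() - mean.min())

# Synthetic ratings for 20 candidate videos from 30 raters.
arousal = normalized_scores(np.random.randint(1, 10, size=(20, 30)))
valence = normalized_scores(np.random.randint(1, 10, size=(20, 30)))

# Illustrative selection: keep clips at the extremes of both dimensions,
# so each stimulus elicits a clearly distinct emotion intensity
# (the 0.8 / 0.2 thresholds are arbitrary for this sketch).
selected = np.where(((arousal > 0.8) & (valence > 0.8))
                    | ((arousal < 0.2) & (valence < 0.2)))[0]
print("candidate stimulus videos:", selected)
```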

Cited by 15 publications (2 citation statements). References: 45 publications.
“…Based on the conditions during the signal acquisition process, further steps were taken, such as filtration and channel selection. This makes a difference, as documented by [22, 58]. The results show the potential usability of the presented framework in real-time applications and are a step towards enhanced motion prediction in BMI applications.…”
Section: Results (mentioning, confidence: 97%)
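The quoted statement attributes part of its result to filtration and channel selection. A minimal sketch of those two preprocessing steps, assuming a zero-phase Butterworth band-pass filter via SciPy; the electrode montage, cutoff frequencies, and sampling rate are chosen purely for illustration and are not taken from the cited works.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs, low=1.0, high=45.0, order=4):
    """Zero-phase Butterworth band-pass filter over each EEG channel.

    `eeg` is an (n_channels, n_samples) array; the 1-45 Hz passband
    is an illustrative default, not a value from the cited works.
    """
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

# Hypothetical channel selection: keep a named subset of electrodes.
CHANNELS = ["Fp1", "Fp2", "F3", "F4", "C3", "C4", "O1", "O2"]  # assumed montage
KEEP = ["F3", "F4", "O1", "O2"]

fs = 256                                         # assumed sampling rate (Hz)
raw = np.random.randn(len(CHANNELS), fs * 10)    # 10 s of synthetic EEG
filtered = bandpass(raw, fs)
idx = [CHANNELS.index(ch) for ch in KEEP]
subset = filtered[idx]                           # selected channels only
print(subset.shape)                              # (4, 2560)
```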
“…When it comes to multimodal feature extraction, the initial approach is to fuse facial expression data with audio signals, aiming to improve the reliability and effectiveness of emotion computing. Su et al. [28] integrated EEG information with eye movement signals and classified emotions with a multimodal deep neural network (DNN). Kessous et al. [29] fused the extracted speech and facial expression features and classified emotions using an SVM.…”
Section: Related Work (mentioning, confidence: 99%)
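The statement describes feature-level fusion of EEG and eye movement signals followed by a neural-network classifier. A minimal sketch of that pattern using scikit-learn; the feature dimensions, class count, network size, and synthetic data are all assumptions for illustration, not the method of the cited work.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Synthetic stand-ins for already-extracted features (shapes are assumptions):
n_trials = 200
eeg_feats = np.random.randn(n_trials, 310)   # e.g., 62 channels x 5 bands
eye_feats = np.random.randn(n_trials, 33)    # e.g., pupil/fixation/saccade stats
labels = np.random.randint(0, 3, n_trials)   # three emotion classes (assumed)

# Feature-level fusion: scale each modality separately, then concatenate,
# so neither modality dominates purely through its numeric range.
fused = np.hstack([
    StandardScaler().fit_transform(eeg_feats),
    StandardScaler().fit_transform(eye_feats),
])

# A small fully connected network serving as the multimodal classifier.
clf = MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)
clf.fit(fused[:150], labels[:150])
print("held-out accuracy:", clf.score(fused[150:], labels[150:]))
```

With random features the accuracy hovers near chance; the point of the sketch is the fuse-then-classify structure, which also accommodates the SVM variant mentioned in the same statement by swapping the classifier.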