2011
DOI: 10.1007/978-3-642-21793-7_73

Emotional Human-Machine Interaction: Cues from Facial Expressions

Abstract: Emotion detection provides a promising basis for future-oriented, human-centered design of human-machine interfaces. Affective computing can facilitate human-machine communication. Adaptive advanced driver assistance systems (ADAS) that depend on the emotional state of the driver can be applied in cars. In contrast to the majority of former studies, which used only static recognition methods, we investigated a new dynamic approach for detecting emotions in facial expressions in an artificial…

Cited by 10 publications (8 citation statements)
References 20 publications
“…In 2012, Alvarez et al. [23] introduced emotionally adaptive vehicle user interfaces using logistic model trees, multi-layer perceptron, naive Bayes, and logistic regression methods to predict the driver’s emotions from speech signals. In 2011, Tews et al. [24] proposed an emotional human–machine interaction system using a statistical variance method to predict emotions from the face. In 2012, Paschero et al. [25] introduced a real-time classifier for a vehicle driver’s emotion recognition from the face, using a multi-layer perceptron to classify emotions such as happiness, anger, fear, sadness, disgust, and surprise.…”
Section: Related Work
confidence: 99%
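As an illustration of the multi-layer perceptron approach mentioned in the citation above, the sketch below trains an MLP to classify the six basic emotions from pre-extracted facial feature vectors. It is not the cited authors' pipeline: the feature dimensionality, network size, and placeholder data are assumptions, and real feature extraction (landmarks, action units, etc.) is taken to happen upstream.

# Hedged sketch: MLP classification of six basic emotions from facial features.
# Hypothetical feature shape and synthetic data; not the cited papers' actual method.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

EMOTIONS = ["happiness", "anger", "fear", "sadness", "disgust", "surprise"]

# Placeholder data: 600 samples of 68 facial-feature values each (assumed shape).
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 68))
y = rng.integers(0, len(EMOTIONS), size=600)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Small two-layer perceptron; hidden sizes are illustrative, not tuned.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_train, y_train)

print("accuracy on held-out data:", clf.score(X_test, y_test))
print("predicted emotion for first test sample:", EMOTIONS[clf.predict(X_test[:1])[0]])

With real facial features, the same structure applies: one row per frame or image, one label per basic emotion class.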
“…Among all these works, some [15, 25, 26, 28, 31, 33] have proposed systems running in a non-car environment, whereas others [20, 29, 37, 40, 41, 42] have been conducted in a real-time environment. Some [14, 16, 17, 18, 24, 30, 38, 39] have used a simulator environment.…”
Section: Related Work
confidence: 99%
“…Therefore, detecting and mitigating driver emotions by using affective computing in an emotion-aware system may ensure driving safety (Ihme et al., 2019). One idea for such a system is to interpret the user's emotional state and provide assistance that reduces the negative consequences of certain emotional states (Klein et al., 2002; Tews et al., 2011; Jeon, 2015; Löcken et al., 2017; Ihme et al., 2018). Furthermore, in the context of high-level automated driving functions, an automated assessment of emotions could allow driving styles or warnings to be adapted to the driver's current emotional state to maximize comfort and optimize the driving experience (Techer et al., 2019).…”
Section: Introduction
confidence: 99%
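The citation above describes the general idea of an emotion-aware system that adapts assistance or warnings to the driver's estimated emotional state. The following minimal sketch shows one way such adaptation logic could look; the rules, thresholds, and configuration fields are hypothetical and are not taken from any of the cited systems.

# Hedged sketch of emotion-aware assistance adaptation (hypothetical rules).
from dataclasses import dataclass

@dataclass
class AssistanceConfig:
    warning_lead_time_s: float   # how early warnings are issued
    driving_style: str           # e.g. "comfort" or "dynamic"
    offer_support_dialog: bool   # whether to proactively offer help

def adapt_assistance(emotion: str, confidence: float) -> AssistanceConfig:
    """Map an estimated driver emotion to an assistance configuration.

    Only acts on estimates above a confidence threshold; otherwise keeps defaults.
    """
    default = AssistanceConfig(warning_lead_time_s=2.0, driving_style="comfort",
                               offer_support_dialog=False)
    if confidence < 0.7:
        return default
    if emotion in ("anger", "fear"):
        # Negative high-arousal states: earlier warnings, calm automation, offer support.
        return AssistanceConfig(3.5, "comfort", True)
    if emotion == "sadness":
        # Negative low-arousal state: keep the driving style, but check in with the driver.
        return AssistanceConfig(2.5, "comfort", True)
    return default

print(adapt_assistance("anger", 0.85))

In a real system the emotion estimate would come from a classifier such as the facial-expression model sketched earlier, and the adaptation policy would be designed and validated with driving studies rather than fixed rules.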