2017 IEEE Second Ecuador Technical Chapters Meeting (ETCM)
DOI: 10.1109/etcm.2017.8247472

Development of animated facial expressions to express emotions in a robot: RobotIcon

Cited by 14 publications (5 citation statements)
References 5 publications
“…Currently, the evaluation of the degree of anthropomorphism in robot facial expressions is mainly achieved by obtaining the recognition rate through subject recognition experiments [40, 41]. This evaluation method effectively measures the realism of the facial expressions achieved by the humanoid robot head.…”
Section: Discussion
confidence: 99%
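
The recognition-rate evaluation described in the statement above can be illustrated with a short sketch: the per-emotion recognition rate is the fraction of presentations in which subjects identified the intended emotion. The function name and the trial data below are hypothetical, meant only to show the computation, not the cited authors' protocol.

```python
from collections import defaultdict

def recognition_rates(trials):
    """Compute per-emotion recognition rates from subject responses.

    `trials` is a list of (intended_emotion, recognized_emotion) pairs,
    one pair per presentation of a robot facial expression to a subject.
    """
    shown = defaultdict(int)    # how many times each emotion was displayed
    correct = defaultdict(int)  # how many times it was correctly identified
    for intended, recognized in trials:
        shown[intended] += 1
        if recognized == intended:
            correct[intended] += 1
    return {emotion: correct[emotion] / shown[emotion] for emotion in shown}

# Hypothetical data: (expression shown, emotion the subject reported).
trials = [
    ("happiness", "happiness"), ("happiness", "happiness"),
    ("anger", "anger"), ("anger", "anger"), ("anger", "anger"),
    ("neutral", "sadness"), ("neutral", "neutral"),
    ("disgust", "anger"), ("disgust", "disgust"),
]
print(recognition_rates(trials))
# {'happiness': 1.0, 'anger': 1.0, 'neutral': 0.5, 'disgust': 0.5}
```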
“…Dyck et al. [38] discovered that the facial expressions created by virtual reality technology could hardly convey the sense of disgust, although they could easily convey the senses of sadness and fear. As pointed out by Danev et al. [39], the mouth is the key element for conveying emotions, but it may also lead to confused emotional expression. Via the mouth, neutral emotion has the lowest recognition rate, while anger is the easiest to express.…”
Section: Conveying Emotion By Facial Expression
confidence: 99%
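
As a rough illustration of the mouth-centric parameterization suggested in the statement above (the parameter names and values are assumptions for a screen-based animated face, not taken from [39] or from the RobotIcon paper), a small emotion-to-mouth-shape lookup might look like this:

```python
# Hypothetical mouth parameters for an animated face:
# curvature in [-1, 1] (-1 = corners down, +1 = corners up), openness in [0, 1].
MOUTH_SHAPES = {
    "happiness": {"curvature": 0.8, "openness": 0.3},
    "sadness":   {"curvature": -0.7, "openness": 0.1},
    "anger":     {"curvature": -0.4, "openness": 0.6},
    "neutral":   {"curvature": 0.0, "openness": 0.0},  # flat mouth: easily confused
}

def mouth_for(emotion):
    """Return the mouth parameters used to animate the given emotion."""
    return MOUTH_SHAPES.get(emotion, MOUTH_SHAPES["neutral"])

print(mouth_for("anger"))  # {'curvature': -0.4, 'openness': 0.6}
```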
“…Instead of real faces, a simplified comic style was adopted in the interface. This not only makes it easier to control facial expressions, but also avoids misinformation [39].…”
Section: Interface Prototype Design and Manipulation Check
confidence: 99%
“…In general, a variety of approaches to conveying emotions exists. Most of the research on emotions in HRI focuses on the expression of emotion using the capabilities of existing robots, such as motion [57], facial expressions [6,11,28,55], color [30,53], sound [30,58], or touch [3]. However, to the best of our knowledge, incorporating emotions into human-robot interaction as an interaction modality and a way to communicate information, namely mission statements in the specific USAR context, has not been studied in the HRI community so far.…”
Section: Related Work
confidence: 99%