2016
DOI: 10.5772/62181
Humanoid Head Face Mechanism with Expandable Facial Expressions

Abstract: Recently, social robots for daily-life activities have become more common. To this end, a humanoid robot with realistic facial expressions is a strong candidate for common chores. This paper presents the development of a humanoid face mechanism with simplified system complexity that generates human-like facial expressions. The distinctive feature of this face robot is its use of significantly fewer actuators: only three servo motors for facial expressions and five for the remaining head motions have been…


Cited by 20 publications (12 citation statements)
References 27 publications
“…Early works [10]- [16], [28] solely focus on hardware design of the robot face and pre-program the facial expressions. Kismet [20], [29]- [34] generates diverse facial expressions by interpolating among predefined basis facial postures over a three-dimensional space.…”
Section: Related Work
confidence: 99%
“…The key limitation is the lack of a general learning framework that can learn from limited human supervision. Some traditional methods [10]- [16] define a set of pre-specified facial expressions. Others generalize this process by searching for the closest match in a database [17] or by following a fitness function [17], [18].…”
Section: Introduction
confidence: 99%
“…However, although numerous studies have developed androids for emotional interactions (Kobayashi and Hara, 1993; Kobayashi et al., 2000; Minato et al., 2004, 2006, 2007; Weiguo et al., 2004; Ishihara et al., 2005; Matsui et al., 2005; Berns and Hirth, 2006; Blow et al., 2006; Hashimoto et al., 2006, 2008; Oh et al., 2006; Sakamoto et al., 2007; Lee et al., 2008; Takeno et al., 2008; Allison et al., 2009; Lin et al., 2009, 2016; Kaneko et al., 2010; Becker-Asano and Ishiguro, 2011; Ahn et al., 2012; Mazzei et al., 2012; Tadesse and Priya, 2012; Cheng et al., 2013; Habib et al., 2014; Yu et al., 2014; Asheber et al., 2016; Glas et al., 2016; Marcos et al., 2016; Faraj et al., 2021; Nakata et al., 2021; Table 1), few have empirically validated the androids that were developed. First, no study has validated androids' AUs coded using FACS (Ekman and Friesen, 1978; Ekman et al., 2002).…”
Section: Introduction
confidence: 99%
“…Prince and Suliman [1] proposed the Artificial Brain Emotion Recognition and Generation System (ABERGS), a method for recognizing emotion from a human and generating a corresponding robotic mood state using Fuzzy Kohonen Clustering Network (FKCN) logic. Wagshum et al. [2] reported a new mechanism design for a humanoid robot head with the basic facial expressions. Their approach is cost-effective and less complex.…”
Section: Introduction
confidence: 99%