2014
DOI: 10.1080/01691864.2013.867290

Modeling and design of a humanoid robotic face based on an active drive points model

Cited by 16 publications (10 citation statements)
References 5 publications

“…The displacements of these feature points were obtained from video images and compared to verify if the android could replicate the facial expressions successfully. Yu et al (2014) investigated these basic emotions by comparing 13 facial feature points for a human male and his replica android with an optical motion capture system and calculated the average difference between the three-dimensional displacements as a similarity index of their facial deformations. The above studies analyzed the facial deformations as sparse distributions of displacement vectors.…”
Section: Introduction
Confidence: 99%
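As a rough illustration of the similarity index described in the statement above, the sketch below averages the differences between corresponding 3-D feature-point displacements of a human and an android. The function name, the array shapes, and the use of the Euclidean norm as the per-point difference are assumptions for illustration; the cited paper may define the averaging differently.

```python
import numpy as np

def facial_similarity_index(human_disp, android_disp):
    """Average difference between corresponding 3-D feature-point
    displacements (smaller value = more similar facial deformation).

    human_disp, android_disp: arrays of shape (13, 3), one 3-D
    displacement vector per tracked facial feature point.
    """
    human_disp = np.asarray(human_disp, dtype=float)
    android_disp = np.asarray(android_disp, dtype=float)
    # Euclidean distance between each pair of displacement vectors
    per_point_diff = np.linalg.norm(human_disp - android_disp, axis=1)
    # Similarity index: mean of the per-point differences
    return per_point_diff.mean()

if __name__ == "__main__":
    # Synthetic example: android displacements are the human ones plus noise
    rng = np.random.default_rng(0)
    human = rng.normal(size=(13, 3))
    android = human + rng.normal(scale=0.1, size=(13, 3))
    print(f"similarity index: {facial_similarity_index(human, android):.3f}")
```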
“…Up to now, the team in IRI, BIT, has developed 6 generations of BHR humanoid robots, from BHR-1 to BHR-6 (Huang et al, 2002; Huang et al, 2007; Yu et al, 2014b; Meng et al, 2015).…”
Section: Introduction
Confidence: 99%
“…Robot faces are important information display devices that show several types of communication cues such as intention, attention, emotion, and demand, with the combined deformations of several facial parts. Movable mechanical parts of the android robot's face are covered with a flexible skin-like sheet to exhibit spatially-continuous lifelike surface deformations (Kobayashi et al, 1994, 2000, 2003; Weiguo et al, 2004; Hanson et al, 2005; Berns and Hirth, 2006; Hashimoto et al, 2006, 2008; Oh et al, 2006; Lee et al, 2008; Allison et al, 2009; Becker-Asano et al, 2010; Becker-Asano and Ishiguro, 2011; Tadesse and Priya, 2012; Cheng et al, 2013; Chihara et al, 2013; Yu et al, 2014; Asheber et al, 2016; Glas et al, 2016; Lin et al, 2016). The skin sheet is supported by skull-shaped shells to maintain a life-like shape and is connected to internal movable mechanical parts at several points.…”
Section: Introduction
Confidence: 99%