2009
DOI: 10.1145/1658866.1658877
An extensible framework for interactive facial animation with facial expressions, lip synchronization and eye behavior

Abstract: In this article we describe our approach to generating convincing and empathetic facial animation. Our goal is to develop a robust facial animation platform that is usable and can be easily extended. We also want to facilitate the integration of research in the area and to directly incorporate the characters in interactive applications such as embodied conversational agents and games. We have developed a framework capable of easily animating MPEG-4 parameterized faces through high-level description of facial ac…
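As a loose illustration of what "animating MPEG-4 parameterized faces through a high-level description of facial actions" could look like, the sketch below maps named actions to sparse frames of Facial Animation Parameters (FAPs). The action names, FAP indices and magnitudes are assumptions made up for illustration, not the framework's actual API or data.

# A minimal sketch, assuming a mapping from high-level facial actions to
# MPEG-4 FAP values; the indices and magnitudes below are illustrative only.
ACTION_TO_FAPS = {
    # "joy": raise the mouth corners (FAPs 12/13) and stretch the lips (FAPs 6/7).
    "joy": {6: 120, 7: 120, 12: 200, 13: 200},
    # "surprise": open the jaw (FAP 3) and raise the brows (FAPs 31-34).
    "surprise": {3: 400, 31: 150, 32: 150, 33: 120, 34: 120},
}

def action_to_fap_frame(action, intensity=1.0):
    """Scale the FAP template of a high-level action by an intensity in [0, 1]."""
    template = ACTION_TO_FAPS.get(action, {})
    return {fap: int(value * intensity) for fap, value in template.items()}

# Example: a half-strength smile expressed as a sparse FAP frame.
print(action_to_fap_frame("joy", intensity=0.5))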

Cited by 14 publications (13 citation statements)
References 33 publications
“…It explores the functionalities of the VHFace and Webcam Manager modules, using facial feature detector algorithms (for face, eye and mouth regions) from OpenCV [Viola and Jones 2001; Castrillón et al. 2007]. The mapping of the detected features to simple events (opened mouth, closed mouth, smile and direction of the horizontal gaze) and then to FAP animations is implemented according to the methodology of Queiroz et al. [Queiroz et al. 2009]. Once information from face components is acquired, it is used to animate the avatar's face, as shown in Figures 1 and 11.…”
Section: Results
confidence: 99%
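For a concrete picture of the feature-to-event mapping described in this statement, the fragment below sketches it with OpenCV's stock Viola-Jones cascades and crude geometric heuristics. The cascade choices, thresholds, event names and the gaze heuristic are assumptions for illustration, not the detectors or mapping rules used in the cited work.

import cv2

# Stock Viola-Jones cascades shipped with OpenCV (an assumption: the cited
# work may use different detectors and parameters).
face_cc  = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cc   = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
smile_cc = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_smile.xml")

def detect_events(frame_bgr):
    """Return a list of simple facial events detected in one webcam frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cc.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    events = []
    for (x, y, w, h) in faces[:1]:                    # consider the first face only
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cc.detectMultiScale(roi, 1.1, 10)
        smiles = smile_cc.detectMultiScale(roi[h // 2:, :], 1.7, 20)  # lower half of face
        if len(smiles) > 0:
            events.append("smile")
        # Crude horizontal-gaze heuristic: compare the mean eye centre with
        # the face centre (an illustrative assumption, not the paper's method).
        if len(eyes) >= 2:
            eye_cx = sum(ex + ew / 2.0 for (ex, ey, ew, eh) in eyes[:2]) / 2.0
            offset = (eye_cx - w / 2.0) / w
            if offset > 0.05:
                events.append("gaze_right")
            elif offset < -0.05:
                events.append("gaze_left")
    return events

# Usage: events = detect_events(cv2.imread("webcam_frame.png"))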
“…The facial animation module, called VHFace Manager, integrates the framework proposed by Queiroz et al. [Queiroz et al. 2009], based on the XFace core libraries [Balci 2004]. The framework follows the MPEG-4 Facial Animation (FA) standard [Pandzic and Forchheimer 2003] for parameterization and animation of faces.…”
Section: VHFace - Facial Animation
confidence: 99%
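As a rough illustration of the MPEG-4 FA parameterization mentioned here, the snippet below builds an animation as a sequence of sparse FAP frames by interpolating linearly from a neutral face to a target frame. The frame count, FAP index and magnitude are illustrative assumptions and do not reproduce the XFace or Queiroz et al. pipeline.

def interpolate_fap_animation(target_faps, n_frames=25):
    """Build a list of sparse FAP frames going linearly from neutral
    (all zeros) to target_faps, one dict per animation frame."""
    animation = []
    for i in range(1, n_frames + 1):
        t = i / n_frames                      # interpolation factor in (0, 1]
        animation.append({fap: int(v * t) for fap, v in target_faps.items()})
    return animation

# Example: 25 frames (one second at 25 fps) of a jaw-opening gesture (FAP 3).
frames = interpolate_fap_animation({3: 400}, n_frames=25)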
“…Expressive visual speech, being the emotional manifestation of a personality, requires efficient and pragmatic communication. Advanced research on facial animation has mainly focused on making the animated avatar clearly understood, exploring the expressiveness, communication and interactivity aspects of ECA development [31]. The human face, being the mirror of internal sensation, is undoubtedly the most significant art object and a pivotal part that plays a meaningful role in this interaction process.…”
Section: Introduction
confidence: 99%