Despite extensive research on face perception, few studies have investigated individuals’ knowledge about the physical features of their own face. In this study, 50 participants indicated the location of key features of their own face, relative to an anchor point corresponding to the tip of the nose, and the results were compared to the true location of the same individual’s features from a standardised photograph. Horizontal and vertical errors were analysed separately. An overall bias to underestimate vertical distances revealed a distorted face representation, with reduced face height. Factor analyses were used to identify separable subconfigurations of facial features with correlated localisation errors. Independent representations of upper and lower facial features emerged from the data pattern. The major source of variation across individuals was in the represented face shape, ranging from tall/thin to short/wide. Visual identification of one’s own face is excellent, and facial features are routinely used for establishing personal identity. However, our results show that spatial knowledge of one’s own face is remarkably poor, suggesting that face representation may not contribute strongly to self-awareness.
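The analysis the abstract describes — signed horizontal and vertical localisation errors relative to the nose-tip anchor, followed by a factor analysis of how those errors covary across participants — can be illustrated with a minimal sketch. This is not the authors' analysis code; the array layout, the number of features, and the use of scikit-learn's FactorAnalysis are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' code): per-feature horizontal/vertical
# localisation errors relative to a nose-tip anchor, then a factor analysis
# of the error matrix across participants.
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical data: indicated and true (x, y) positions of N facial features
# for P participants, each expressed relative to the nose tip.
P, N = 50, 8                                  # 50 participants, 8 assumed features
rng = np.random.default_rng(0)
indicated = rng.normal(size=(P, N, 2))        # stand-in for participants' judgements
true_pos = rng.normal(size=(P, N, 2))         # stand-in for photo-derived positions

errors = indicated - true_pos
horizontal_err = errors[..., 0]               # signed x errors, shape (P, N)
vertical_err = errors[..., 1]                 # signed y errors, shape (P, N)

# Overall vertical bias: a negative mean would indicate underestimated vertical
# distances, i.e. a vertically compressed face representation.
print("mean vertical error:", vertical_err.mean())

# Factor analysis over per-feature error profiles, looking for subconfigurations
# of features whose localisation errors covary across individuals.
fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(np.hstack([horizontal_err, vertical_err]))
print("factor loadings shape:", fa.components_.shape)
```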
In the character animation industry, animators use facial UIs to animate a character's face. A facial UI provides widgets and handles that the animator manipulates to control the character's facial regions. This paper presents a facial UI design approach for controlling the animation of the six basic facial expressions of the anthropomorphic face. The design is based on square-shaped widgets [5] holding circular handles that allow the animator to produce the muscular activity associated with the basic facial expressions [1]. We implemented a prototype of the facial UI design in the Blender open-source animation software and conducted a preliminary pilot study with three animators. Two parameters were evaluated: the number of clicks and the time taken to animate the six basic facial expressions. The study revealed little variation across animators in either parameter, despite the natural differences in their creative performance.
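The paper does not detail the prototype's implementation, but in Blender a widget-and-handle rig of this kind is commonly built from a controller bone whose movement is clamped to the widget area and wired to expression shape keys through drivers. The sketch below shows one plausible wiring under that assumption; the object, bone, and shape-key names are hypothetical and not taken from the paper.

```python
# Sketch (hypothetical names, not the paper's prototype): a circular handle
# implemented as a controller bone, clamped inside its square widget and
# driving an expression shape key on the face mesh via a Blender driver.
import bpy

face = bpy.data.objects["Face"]            # assumed face mesh with shape keys
rig = bpy.data.objects["FacialUI_Rig"]     # assumed armature holding the handles
handle = rig.pose.bones["handle_smile"]    # assumed circular-handle bone

# Clamp the handle's local translation to the square widget area.
limit = handle.constraints.new('LIMIT_LOCATION')
limit.use_min_x = limit.use_max_x = True
limit.use_min_y = limit.use_max_y = True
limit.min_x, limit.max_x = -1.0, 1.0
limit.min_y, limit.max_y = -1.0, 1.0
limit.owner_space = 'LOCAL'

# Drive a shape key (one unit of "muscular activity") from the handle's X location.
smile = face.data.shape_keys.key_blocks["expr_happiness"]
fcurve = smile.driver_add("value")
driver = fcurve.driver
driver.type = 'SCRIPTED'
var = driver.variables.new()
var.name = "hx"
var.type = 'TRANSFORMS'
target = var.targets[0]
target.id = rig
target.bone_target = "handle_smile"
target.transform_type = 'LOC_X'
target.transform_space = 'LOCAL_SPACE'
driver.expression = "max(0.0, min(1.0, hx))"   # map handle position to a 0..1 key value
```

Repeating this pattern with one handle per facial region, each mapped to the shape keys for a given basic expression, would yield the square-widget layout the abstract describes.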