In recent years, neonatal face analysis has enabled the investigation and development of non-invasive methods for classifying painful stimuli in newborns. In this context, changes in facial movement and expression have provided relevant scientific and clinical information, since they reflect the pain actually perceived by the newborn. In this work, we propose and implement a computational framework based on triangular meshes, with the goal of generating high-resolution, spatially normalized atlases potentially useful for automatic neonatal pain assessment.
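A common building block for spatially normalized mesh atlases is rigidly aligning each subject's mesh to a reference before averaging vertex positions. The sketch below illustrates this idea only; it is not the framework described in the abstract. The functions `procrustes_align` and `mean_atlas` are hypothetical names, and the meshes are assumed to share a common triangulation (vertex correspondence) established beforehand.

```python
import numpy as np

def procrustes_align(source, target):
    """Rigidly align source vertices to target (rotation + translation)
    via the Kabsch algorithm; a standard step before averaging meshes.
    Both arrays have shape (n_vertices, 3) with matching correspondence."""
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    A, B = source - mu_s, target - mu_t          # center both point sets
    U, _, Vt = np.linalg.svd(A.T @ B)            # cross-covariance SVD
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # avoid improper reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return A @ R.T + mu_t

def mean_atlas(meshes):
    """Average vertex positions of rigidly aligned meshes that share
    one triangulation, yielding a simple mean-shape atlas."""
    ref = meshes[0]
    aligned = [ref] + [procrustes_align(m, ref) for m in meshes[1:]]
    return np.mean(aligned, axis=0)
```

In practice, atlas pipelines typically add non-rigid registration and scale normalization on top of this rigid step; the rigid alignment alone only removes pose differences.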
Human faces convey a wealth of information, such as gender, identity, and emotional state. Understanding the differences between volunteers' eye movements on benchmark face recognition and perception tests can therefore indicate the most discriminative facial regions and improve performance in this visual cognitive task. The aim of this work is to characterize and classify these eye-movement strategies using multivariate statistics and machine learning techniques, achieving up to 94.8% accuracy. Our experimental results show that volunteers, on average, focused their visual attention on the eyes, but those with superior performance on the tests looked more closely at the nose region.
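One way such eye-movement strategies can be classified is to summarize each volunteer's gaze as the fraction of fixation time on facial regions of interest and feed those feature vectors to a standard classifier. The sketch below is a minimal illustration with synthetic data, not the authors' pipeline: the region features, group means, and the choice of a random forest are all assumptions for demonstration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 200  # synthetic volunteers per group

# Hypothetical features: fraction of fixation time on [eyes, nose, mouth, other].
# Assumed pattern (mirroring the abstract's finding): superior performers
# allocate more fixation time to the nose region.
superior = np.clip(rng.normal([0.45, 0.35, 0.10, 0.10], 0.05, size=(n, 4)), 0, 1)
average  = np.clip(rng.normal([0.55, 0.15, 0.15, 0.15], 0.05, size=(n, 4)), 0, 1)

X = np.vstack([superior, average])
y = np.array([1] * n + [0] * n)  # 1 = superior performer, 0 = average

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

Because the synthetic groups are well separated in the nose-fixation feature, the classifier attains high held-out accuracy; with real gaze data, accuracy would depend on the chosen regions, features, and model.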