Earlier researchers extracted transient facial thermal features from thermal infrared images (TIRIs) to make binary distinctions between the expressions of affective states. However, affective human-computer interaction might require machines to distinguish between subtle facial expressions of affective states. This work, for the first time, attempts to use transient facial thermal features to recognise a much wider range of facial expressions. A database of 324 time-sequential, visible-spectrum and thermal facial images was developed, representing the facial expressions of 23 participants in different situations. A novel facial thermal feature extraction, selection and classification approach was developed and invoked on various Gaussian mixture models constructed using: neutral and pretended happy and sad faces; faces with multiple positive and negative facial expressions; faces with neutral and the six (pretended) basic facial expressions; and faces with evoked happiness, sadness, disgust and anger. This work demonstrates that (1) infrared imaging can be used to observe affective-state-specific thermal variations on the face; (2) pixel grey-level analysis of TIRIs can help localise significant facial thermal feature points along the major facial muscles; and (3) cluster-analytic classification of transient thermal features can help distinguish between the facial expressions of affective states in an optimised eigenspace of input thermal feature vectors. The observed classification results also exhibited the influence of a Gaussian mixture model's structure on classifier performance. The work also identified pertinent directions for future research on the use of facial thermal features in automated facial expression classification and affect recognition.
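The classification scheme the abstract describes, projecting thermal feature vectors into an eigenspace and discriminating between expression classes via Gaussian mixture models, can be illustrated with a minimal sketch. This is not the authors' implementation: the data here are synthetic, and the feature dimensions, class means, and mixture sizes are illustrative assumptions only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic stand-in for transient thermal feature vectors (e.g. intensity
# values sampled along facial muscles): three well-separated expression
# classes, purely for illustration.
n_per_class, n_features = 40, 20
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
               for c in (0.0, 1.5, 3.0)])
y = np.repeat([0, 1, 2], n_per_class)

# Project the input thermal feature vectors into a reduced eigenspace.
pca = PCA(n_components=5).fit(X)
Z = pca.transform(X)

# Fit one Gaussian mixture model per expression class (mixture size assumed).
models = {c: GaussianMixture(n_components=2, random_state=0).fit(Z[y == c])
          for c in np.unique(y)}

def classify(z):
    # Assign a feature vector to the class whose mixture assigns it the
    # highest log-likelihood.
    scores = {c: gm.score_samples(z.reshape(1, -1))[0]
              for c, gm in models.items()}
    return max(scores, key=scores.get)

preds = np.array([classify(z) for z in Z])
accuracy = (preds == y).mean()
```

On data this cleanly separated the per-class mixtures recover the labels almost perfectly; real thermal data would of course be far noisier, which is why the model structure matters for classifier performance.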
Machines would require the ability to perceive and adapt to affect in order to achieve artificial sociability. Most autonomous systems use Automated Facial Expression Classification (AFEC) and Automated Affect Interpretation (AAI) to achieve sociability. Varying lighting conditions, occlusion, and control over physiognomy can influence the real-life performance of vision-based AFEC systems. Physiological signals provide complementary information for AFEC and AAI. We employed transient facial thermal features for AFEC and AAI. Infrared thermal images with participants' normal expressions and intentional expressions of happiness, sadness, disgust, and fear were captured. Facial points that undergo significant thermal changes with a change in expression, termed Facial Thermal Feature Points (FTFPs), were identified. Discriminant analysis was invoked on principal components derived from the Thermal Intensity Values (TIVs) recorded at the FTFPs. Cross-validation and person-independent classification resulted in success rates of 66.28% and 56.0%, respectively. Classification significance tests suggest that (1) like other physiological cues, facial skin temperature provides useful information about affective states and their facial expression; (2) patterns of facial skin temperature variation can complement other cues for AFEC and AAI; and (3) infrared thermal imaging may help achieve artificial sociability in robots and autonomous systems.
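The pipeline in this second abstract, principal components derived from the TIVs at the FTFPs, followed by discriminant analysis with cross-validation, can be sketched as below. The data are synthetic and the number of classes, FTFPs, and retained components are assumptions for illustration, not the study's actual values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Synthetic stand-in for Thermal Intensity Values (TIVs) recorded at
# Facial Thermal Feature Points (FTFPs): five expression classes
# (e.g. neutral, happiness, sadness, disgust, fear), dimensions assumed.
n_per_class, n_ftfps = 30, 25
X = np.vstack([rng.normal(loc=m, scale=1.0, size=(n_per_class, n_ftfps))
               for m in np.linspace(0.0, 4.0, 5)])
y = np.repeat(np.arange(5), n_per_class)

# Discriminant analysis on principal components of the TIVs, evaluated
# with k-fold cross-validation.
clf = make_pipeline(PCA(n_components=8), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
mean_accuracy = scores.mean()
```

A person-independent evaluation, as reported in the abstract, would instead hold out all samples from one participant per fold (e.g. via scikit-learn's `GroupKFold`), which is typically the harder test and explains the lower 56.0% figure.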