Purpose
Dyadic interactions are significant in human life. Most body sensor network-based studies focus on recognizing daily actions, but little work has been done on recognizing affective actions during interactions. The purpose of this paper is to analyze and recognize affective actions collected from dyadic interactions.
Design/methodology/approach
This paper presents a framework that combines hidden Markov models (HMMs) and the k-nearest neighbor (kNN) classifier through Fisher kernel learning. Furthermore, different features are considered according to the interaction situation (positive or negative); a rough sketch of the general idea follows.
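The abstract does not specify implementation details, so the following Python sketch only illustrates the general Fisher-kernel idea, not the paper's actual method: a generative HMM is fitted to the sequences, each variable-length sequence is mapped to a fixed-length Fisher score vector (here simplified to the gradient of the log-likelihood with respect to the Gaussian emission means), and a discriminative kNN classifier operates on those vectors. The data, model sizes, and library choices (hmmlearn, scikit-learn) are all assumptions.

```python
import numpy as np
from hmmlearn import hmm                              # generative model
from sklearn.neighbors import KNeighborsClassifier    # discriminative model

def fisher_score(model, seq):
    """Gradient of the HMM log-likelihood w.r.t. the Gaussian emission
    means -- a common simplification of the full Fisher score."""
    gamma = model.predict_proba(seq)                  # (T, K) state posteriors
    inv_var = 1.0 / np.array([np.diag(c) for c in model.covars_])  # (K, D)
    diffs = seq[:, None, :] - model.means_[None, :, :]             # (T, K, D)
    grad = (gamma[:, :, None] * inv_var[None, :, :] * diffs).sum(axis=0)
    return grad.ravel()                               # fixed-length vector

# Toy accelerometer-like sequences (hypothetical data, not the paper's dataset)
rng = np.random.default_rng(0)
train_seqs = [rng.standard_normal((50, 6)) for _ in range(20)]
labels = rng.integers(0, 2, size=20)

# Fit one HMM on the pooled training sequences
model = hmm.GaussianHMM(n_components=4, covariance_type="diag", random_state=0)
model.fit(np.vstack(train_seqs), lengths=[len(s) for s in train_seqs])

# Map each sequence to a Fisher vector, then classify with kNN
F = np.array([fisher_score(model, s) for s in train_seqs])
knn = KNeighborsClassifier(n_neighbors=3).fit(F, labels)
print(knn.predict(F[:3]))
```

The design motivation is that the HMM captures the temporal structure of each action, while the Fisher mapping turns sequences of different lengths into vectors of one fixed dimension so that a simple distance-based classifier such as kNN can be applied.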
Findings
Three experiments are conducted in this paper. Experimental results demonstrate that the proposed Fisher kernel learning-based framework outperforms a plain Fisher kernel-based approach, as well as methods using only HMMs or only kNN.
Practical implications
This research may help facilitate nonverbal communication. Moreover, it may help equip social robots and animated agents with affective communication abilities.
Originality/value
The presented framework combines the strengths of both generative and discriminative models. Further, different features are considered based on the interaction situation.