In this paper, we investigate whether information about the touches and rotations applied to an object can be effectively used to classify the emotion of the agent manipulating it. We specifically focus on sequences of basic actions (e.g., grasping, rotating), which are constituents of daily interactions. We use the iCube, a 5 cm cube covered with tactile sensors and embedded with an accelerometer, to collect a new dataset of 11 participants performing action sequences associated with 4 emotions: anger, sadness, excitement and gratitude. Next, we propose 17 high-level hand-crafted features based on the tactile and kinematic data derived from the iCube. Twelve of these features vary significantly as a function of the emotional context in which the action sequence was performed. In particular, a larger surface of the object is engaged in physical contact for anger and excitement than for sadness. Furthermore, the average duration of interactions labeled as sad is longer than for the remaining 3 emotions. More rotations are performed for anger and excitement than for sadness and gratitude. A classification experiment on the four emotions reaches an accuracy of 0.75. This result shows that emotion recognition during hand-object interactions is possible and may foster the development of new intelligent user interfaces.
CCS CONCEPTS: • Human-centered computing → Human computer interaction (HCI); Interaction devices.
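For illustration only, the sketch below shows how high-level features of the kind summarized above (interaction duration, contact surface, movement intensity) might be computed from tactile and accelerometer streams and fed to an off-the-shelf classifier. The data layout, sampling rate, feature definitions, and the choice of a random forest are assumptions for exposition, not the authors' actual pipeline.

```python
# Illustrative sketch, not the paper's implementation: data shapes,
# sampling rate, and the classifier choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(touch, accel, fs=50.0):
    """Compute a few example high-level features for one action sequence.

    touch: (T, n_cells) binary tactile frames (1 = cell in contact)
    accel: (T, 3) accelerometer samples
    fs:    sampling rate in Hz (assumed)
    """
    duration = touch.shape[0] / fs                  # interaction length in seconds
    contact_area = touch.mean()                     # mean fraction of cells in contact
    accel_energy = np.mean(np.linalg.norm(accel, axis=1) ** 2)  # movement intensity proxy
    return np.array([duration, contact_area, accel_energy])

# Hypothetical usage: `recordings` is a list of (touch, accel) pairs and
# `y` the emotion labels (anger, sadness, excitement, gratitude) as integers.
# X = np.vstack([extract_features(t, a) for t, a in recordings])
# clf = RandomForestClassifier(n_estimators=100, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())
```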