Human-Computer Interaction (HCI) can be made more efficient if interactive systems are able to respond to users' emotional states. The foremost task in designing such systems is recognizing a user's emotional state during interaction. Most interactive systems nowadays are touch-enabled. In this work, we propose a model to recognize the emotional state of users of touchscreen devices. The affective state is computed from 2D on-screen gestures using only two features: the number of touch events and the pressure generated at each event. No extra hardware setup is required for the computation. Machine learning techniques were used for classification. Four classifiers, namely Naïve Bayes, K-Nearest Neighbor (KNN), Decision Tree, and Support Vector Machine (SVM), were explored, with SVM giving the highest accuracy of 96.75%.
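The pipeline described above (two touch-derived features fed to an SVM classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic data, the class labels, the RBF kernel choice, and the feature distributions are all assumptions made for demonstration only.

```python
# Hedged sketch of two-feature affect classification with an SVM.
# The feature vectors [touch_event_count, mean_pressure] and the two
# hypothetical classes (0 = calm, 1 = stressed) are illustrative
# assumptions; the paper's actual dataset and labels are not shown here.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic, well-separated clusters standing in for real gesture logs:
# assume the "stressed" class produces more touch events at higher pressure.
calm = np.column_stack([rng.normal(10, 2, 50), rng.normal(0.3, 0.05, 50)])
stressed = np.column_stack([rng.normal(25, 3, 50), rng.normal(0.7, 0.05, 50)])
X = np.vstack([calm, stressed])
y = np.array([0] * 50 + [1] * 50)

# RBF kernel is a common default for low-dimensional continuous features;
# the paper does not specify the kernel, so this is an assumption.
clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)
train_acc = clf.score(X, y)
```

In practice the feature extraction step (counting touch events and averaging per-event pressure from the touchscreen API) would replace the synthetic arrays, and accuracy would be measured with held-out data or cross-validation rather than on the training set.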