Background: Emotion recognition technology plays an essential role in enhancing Human-Computer Interaction (HCI). In recent years, a novel approach to emotion recognition based on keystroke dynamics has been reported. This approach is desirable in HCI because the data are non-intrusive and easy to obtain. However, previous studies have offered only limited investigation of the phenomenon itself. This study examines the sources of variance in keystroke typing patterns caused by emotions.
Methods: A controlled experiment was conducted to collect subjects' keystroke data in different emotional states induced by the International Affective Picture System (IAPS). Two-way Valence (3) × Arousal (3) ANOVAs were used to examine the collected dataset.
Results: The results indicate that the effect of emotion is significant (p < .001) for keystroke duration, keystroke latency, and the accuracy rate of keyboard typing. However, the emotional effect is small compared with individual variability.
Conclusions: Our findings support the conclusion that keystroke duration, keystroke latency, and typing accuracy are influenced by emotional states. Notably, the finding about effect size suggests that the accuracy of emotion recognition could be further improved by using personalized models. It also explains why real-world applications that authenticate users by monitoring keystrokes may not be disturbed by users' emotional states. The experiment was conducted using standard instruments and is therefore expected to be highly reproducible.
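The two-way Valence × Arousal ANOVA described above can be sketched for a balanced design. This is a minimal NumPy illustration of the decomposition into main-effect, interaction, and error sums of squares; the array layout, level counts, and synthetic data are assumptions for illustration, not the study's actual dataset.

```python
import numpy as np

def two_way_anova(data):
    """data: array of shape (a, b, n) -- a valence levels, b arousal levels,
    n observations per cell (balanced design). Returns the F statistics for
    factor A, factor B, and the A x B interaction."""
    a, b, n = data.shape
    grand = data.mean()
    row = data.mean(axis=(1, 2))   # factor A (valence) level means
    col = data.mean(axis=(0, 2))   # factor B (arousal) level means
    cell = data.mean(axis=2)       # cell means

    ss_a = b * n * np.sum((row - grand) ** 2)
    ss_b = a * n * np.sum((col - grand) ** 2)
    ss_ab = n * np.sum((cell - row[:, None] - col[None, :] + grand) ** 2)
    ss_e = np.sum((data - cell[:, :, None]) ** 2)

    ms_e = ss_e / (a * b * (n - 1))
    f_a = (ss_a / (a - 1)) / ms_e
    f_b = (ss_b / (b - 1)) / ms_e
    f_ab = (ss_ab / ((a - 1) * (b - 1))) / ms_e
    return f_a, f_b, f_ab

# Synthetic example: a strong valence main effect, no arousal effect.
rng = np.random.default_rng(0)
data = rng.normal(size=(3, 3, 20)) + np.array([0.0, 0.0, 3.0])[:, None, None]
f_valence, f_arousal, f_interaction = two_way_anova(data)
# f_valence dominates the other two statistics here.
```

Comparing each F statistic against the appropriate F distribution quantile then yields the p-values reported in the abstract.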
This study applied the Hilbert-Huang Transform (HHT) to spatial-frequency analysis for affective picture classification. The results demonstrate that affective characteristics exist in the spatial-frequency domain of an image, and that horizontal visual stimulation is slightly more effective than vertical visual stimulation at eliciting emotion with respect to valence.
Automatic emotion recognition technology is an important part of building intelligent systems that prevent computers from acting inappropriately. In recent years, a novel approach was developed for recognizing users' emotional states from their keystroke typing patterns on a standard keyboard. However, the previous literature offers very limited investigation of the phenomenon itself. Hence, in our study, we conducted a controlled experiment to collect subjects' keystroke data in different emotional states induced by facial feedback, and examined the differences in the keystroke data between positive and negative emotional states. The results show significant differences in typing patterns under positive and negative emotions for all subjects. Our study provides evidence supporting the feasibility of emotion recognition by keystroke dynamics.
In this chapter we 1) provide a new dataset, collected from the real world, that researchers can use to examine the possible influence of emotions on user authentication based on keystroke dynamics biometrics, or to develop their own systems for recognizing emotions from keystroke dynamics patterns; 2) summarize recent findings in the field of emotion recognition using keystroke dynamics; and 3) provide concrete suggestions for the field of user authentication using keystroke dynamics biometrics, based on the empirical findings derived from the proposed dataset.
Introduction
Emotion plays an essential role in human life because it is the foundation of the human motivational system. Recent studies in psychology, brain science, and communication technology have demonstrated the substantial effects of emotion on human cognition and behavior. Lang et al. (1995) reported an empirical study on the effects of the emotional characteristics of stimuli on subjects' cognitive capacity and memory [6]. Bolls et al. (2001) revealed that subjects tend to remember stimuli that elicit negative emotions better than stimuli that elicit positive emotions [3]. Later researchers focused on using emotionally relevant stimuli to attract subjects' attention and help them remember more of the presented material. Theories and experimental results examining the connections between emotion and motivation have also been reported [7]. Automatic affect recognition technology has been proposed and has attracted considerable attention since its proposal. The technology aims to help in the Human-Computer Interaction (HCI) area, because a computer application that does not understand or adapt to a user's context, such as their location, profession, or emotional state, may cause usability problems.
Such an application could provide annoying feedback, interrupt users at inappropriate times, or increase users' frustration. The main purpose of affect recognition technology is to build intelligent systems that give computer applications information about changes in human emotions. Applications equipped with such technology can detect and respond to a user's emotional state, deliver a better user experience, and provide appropriate feedback to help users. Various methods have been proposed to detect emotions, including facial expressions, voice intonation, and physiological signals collected from the human body. All of these methods achieve high classification rates. However, they can be intrusive to the user, and the required equipment may be difficult to obtain because it is expensive. During the last decade, a novel approach to emotion recognition has been reported: keystroke dynamics. In keystroke dynamics, the time of each key press and release on the keyboard is collected, and keystr...
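The features discussed throughout these abstracts can be computed directly from timestamped key events. Below is a minimal sketch, in plain Python, of extracting "keystroke duration" (dwell time of a single key) and "keystroke latency" (interval between consecutive key presses), plus a character-level typing accuracy rate; the `(key, press_time, release_time)` event format is an assumption for illustration, not the actual data schema used in these studies.

```python
def keystroke_features(events):
    """events: list of (key, press_time, release_time) tuples in seconds,
    ordered by press time. Returns per-key durations and press-to-press
    latencies."""
    durations = [release - press for _, press, release in events]
    latencies = [events[i + 1][1] - events[i][1]
                 for i in range(len(events) - 1)]
    return durations, latencies

def accuracy_rate(typed, target):
    """Fraction of character positions where the typed text matches the
    target text (a simple position-wise definition)."""
    matches = sum(t == g for t, g in zip(typed, target))
    return matches / max(len(target), 1)

# Example: typing "hi" with a pause between the two keys.
events = [("h", 0.00, 0.09), ("i", 0.25, 0.33)]
durations, latencies = keystroke_features(events)
# durations ~ [0.09, 0.08], latencies ~ [0.25]
```

Statistics of these per-key features (means, variances per session) are the kind of quantities the ANOVAs above compare across emotional states.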