A stress detection system is developed based on physiological signals monitored by non-invasive and non-intrusive sensors. The development of this emotion recognition system involved three stages: an experimental setup for physiological sensing, signal preprocessing for the extraction of affective features, and affective recognition using a learning system. Four signals are monitored and analyzed to differentiate affective states in a computer user: galvanic skin response (GSR), blood volume pulse (BVP), pupil diameter (PD), and skin temperature (ST). A support vector machine is used to perform supervised classification of affective states as "stressed" or "relaxed". Results indicate that the monitored physiological signals do, in fact, correlate strongly with changes in the emotional state of our experimental subjects when stress stimuli are applied to the interaction environment. Pupil diameter was also found to be the most significant affective-state indicator, compared to the other three physiological signals monitored.
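As a rough illustration of the classification step described above, the sketch below trains a support vector machine on per-segment physiological features and evaluates it with cross-validation. The feature layout, the synthetic data, and the scikit-learn pipeline are assumptions for illustration only, not the study's actual implementation.

```python
# Minimal sketch (not the authors' code): supervised classification of
# "stressed" vs. "relaxed" states from per-segment physiological features.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per experiment segment, with simple
# statistics (e.g., mean and standard deviation) of GSR, BVP, ST and PD.
X = rng.normal(size=(120, 8))          # 120 segments x 8 features (placeholder)
y = rng.integers(0, 2, size=120)       # 0 = relaxed, 1 = stressed (placeholder)

# Standardize the features, then train an SVM with an RBF kernel.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```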
This study introduces the design of an integrated assistive real-time system developed as an alternative input device to computers for individuals with severe motor disabilities. An assistive technology device, as defined by the Assistive Technology Act of 1998, is "any item, piece of equipment, or product system, whether acquired commercially, modified, or customized, that is used to increase, maintain, or improve the functional capabilities of individuals with disabilities". The proposed real-time system design utilizes electromyographic (EMG) biosignals from cranial muscles and electroencephalographic (EEG) biosignals from the cerebrum's occipital lobe, which are transformed into controls for two-dimensional (2-D) cursor movement, the Left-Click (Enter) command, and an ON/OFF switch for the cursor-control functions. This HCI system classifies biosignals into "mouse" functions by applying amplitude thresholds and performing power spectral density (PSD) estimations on discrete windows of data. Spectral power summations are aggregated over several frequency bands between 8 and 500 Hz and then compared to produce the correct classification. The result is an affordable DSP-based system that, when combined with an on-screen keyboard, enables the user to operate a computer fully without using any extremities.
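The band-power classification described above could look roughly like the following sketch: each window of data is screened against an amplitude threshold, its Welch PSD is summed over a few frequency bands, and the band powers are compared to pick a "mouse" function. The sampling rate, band edges, threshold, and decision rule here are illustrative placeholders, not the system's actual DSP firmware.

```python
# Minimal sketch (assumptions only): classify one window of a biosignal into a
# "mouse" function by summing Welch PSD estimates over frequency bands.
import numpy as np
from scipy.signal import welch

FS = 1000                       # sampling rate in Hz (assumed)
BANDS = {"alpha": (8, 13),      # EEG alpha band (occipital lobe)
         "emg_low": (30, 150),  # EMG activity bands (assumed edges)
         "emg_high": (150, 500)}

def band_powers(window: np.ndarray) -> dict:
    """Sum the PSD of one data window over each frequency band."""
    freqs, psd = welch(window, fs=FS, nperseg=min(256, len(window)))
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

def classify_window(window: np.ndarray, amp_threshold: float = 1.0) -> str:
    """Map a window to a cursor command (illustrative decision rule only)."""
    if np.max(np.abs(window)) < amp_threshold:
        return "idle"                      # below the amplitude threshold
    p = band_powers(window)
    if p["alpha"] > p["emg_low"] + p["emg_high"]:
        return "toggle_cursor_control"     # EEG-dominated window
    return "left_click" if p["emg_high"] > p["emg_low"] else "move_cursor"

# Example: classify a synthetic 1-second window of noise
print(classify_window(np.random.randn(FS)))
```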
To study the reorganization of neural networks in pediatric epilepsy, a consortium of imaging centers was established to collect functional imaging data using common paradigms and similar acquisition parameters. We studied 122 children (64 controls and 58 patients with localization-related epilepsy, LRE) across five sites using EPI BOLD fMRI and an auditory description decision task. After normalization to the MNI atlas, activation maps generated by FSL were separated into three sub-groups using a distance method in a principal component analysis (PCA)-based decisional space. Three activation patterns were identified: (1) the typical distributed network expected for this task, in the left inferior frontal gyrus (Broca’s area) and along the left superior temporal gyrus (Wernicke’s area) (60 controls, 35 patients); (2) a variant left-dominant pattern with greater activation in the IFG, mesial left frontal lobe, and right cerebellum (three controls, 15 patients); and (3) activation in the right-hemisphere counterparts of the first pattern, in Broca’s area (one control, eight patients). Patients were overrepresented in Groups 2 and 3 (P < 0.0004). There were no scanner (P = 0.4) or site effects (P = 0.6). Our data-driven method for separating fMRI activation patterns is independent of the a priori notions and bias inherent in region-of-interest and visual analyses. In addition to the anticipated atypical right-dominant activation pattern, a sub-pattern was identified that involved differences in the intensity and extent of activation within the distributed left-hemisphere language-processing network. These findings suggest a different, perhaps less efficient, cognitive strategy used by the LRE group to perform the task.
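A hedged sketch of this kind of data-driven grouping is shown below: vectorized activation maps are projected into a PCA decisional space and the subjects are split into three sub-groups with a distance-based clustering step. K-means is used here as a stand-in for the paper's distance method, and the data are synthetic placeholders rather than the consortium's fMRI maps.

```python
# Minimal sketch (an assumption about the pipeline, not the authors' code):
# group subjects by their activation maps in a PCA "decisional space".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical data: 122 subjects x N voxels of activation values, already
# normalized to the MNI atlas and masked to in-brain voxels.
n_subjects, n_voxels = 122, 5000
maps = rng.normal(size=(n_subjects, n_voxels))

# Reduce dimensionality to obtain the low-dimensional decisional space.
pca = PCA(n_components=10)
coords = pca.fit_transform(maps)          # subjects as points in PCA space

# Distance-based separation into three activation-pattern sub-groups.
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)
print(np.bincount(groups))                # sub-group sizes
```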
Affective Computing, one of the frontiers of Human-Computer Interaction research, seeks to provide computers with the capability to react appropriately to a user's affective states. To achieve the required on-line assessment of those affective states, we propose to extract features from physiological signals recorded from the user (Blood Volume Pulse, Galvanic Skin Response, Skin Temperature, and Pupil Diameter), which can then be processed by pattern recognition learning systems to classify the user's affective state. An initial implementation of the proposed system was set up to detect "stress" states in a computer user. A computer-based "Paced Stroop Test" was designed to act as a stimulus eliciting emotional stress in the subject. Signal processing techniques were applied to the monitored physiological signals to extract features, which were used by three learning algorithms, Naïve Bayes, Decision Tree, and Support Vector Machine, to classify relaxed vs. stressed states.
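The comparison of the three classifiers named above could be prototyped roughly as follows; the synthetic feature matrix, the hyperparameters, and the evaluation protocol are assumptions for illustration, not the study's actual Paced Stroop Test data or settings.

```python
# Minimal sketch (illustrative only): compare Naive Bayes, Decision Tree and
# SVM on features extracted from the physiological signals.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 11))        # e.g., 11 features from BVP/GSR/ST/PD (placeholder)
y = rng.integers(0, 2, size=200)      # 0 = relaxed, 1 = stressed (placeholder)

models = {
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```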