2020
DOI: 10.1007/s12652-020-01845-y

Stress generation and non-intrusive measurement in virtual environments using eye tracking

Abstract: In real life, it is well understood how stress can be induced and how it is measured. While virtual reality (VR) applications can resemble such stress inducers, it is still an open question if and how stress can be measured in a non-intrusive way during VR exposure. Usually, the quality of VR applications is estimated by user acceptance in the form of presence. Presence itself describes the individual's acceptance of a virtual environment as real and is measured by specific questionnaires. Accordingly, it is e…

Cited by 35 publications (17 citation statements). References 19 publications.
“…Modality and features by citation: [33,34] eye activity (pupil dilation, gaze movement, blinking rate); [35] keystroke (keystroke pattern and dilation); [36] interaction on social media (detection from user posts on social media); [37-39] speech signal (pitch, jitter, energy, speaking rate, length of pauses); [40-42] facial expression (local binary patterns (LBP-TOP), 3D histogram of oriented gradients (3DHOG), weighted random forest (WRF), 3D scale-invariant feature transform (3DSIFT)). Many studies have explored the possibility of stress detection with facial skin temperature [45-47]. Researchers have also investigated other modalities such as pupil dilation, breathing pattern, behavior pattern, keystroke pattern, and social media activity.…”
Section: Author (mentioning)
confidence: 99%
“…3), and they can be gathered on a by-task basis to visualize trends and enable day-to-day performance comparison. The fatigue and stress recognition algorithms are implemented based on the state-of-the-art methods published in [17-20], respectively.…”
Section: System Interface for Stress and Fatigue Monitoring (mentioning)
confidence: 99%
“…They can be generally classified into three categories: a) Use external body signals for emotion recognition, including facial expression, body gestures, gait, eye tracking, etc. These signals can be easily noticed by others but do not always reflect one's real emotional states [15-23]. b) Use internal physiological signals such as heart rate, pulse, skin conductance, blood pressure, electroencephalography (EEG), etc.…”
Section: Introduction (mentioning)
confidence: 99%