2022
DOI: 10.1088/1741-2552/ac69bc

Neural sources of prediction errors detect unrealistic VR interactions

Abstract: Objective: Neural interfaces hold significant promise for implicitly tracking user experience. Their application in VR/AR simulations is especially favorable, as it allows user assessment without breaking the immersive experience. In VR, designing immersion is one key challenge. Subjective questionnaires are the established metric for assessing the effectiveness of immersive VR simulations; however, administering such questionnaires requires breaking the immersive experience they are supposed to assess. Approach: We p…

Cited by 21 publications (22 citation statements)
References 58 publications
“…The work of Gehrke and colleagues (2019, 2022) has already demonstrated that this is possible for a different type of ERP modulation: they introduced temporal mismatches between sensory stimuli received by the visual and haptic modalities, and showed that these mismatches are accompanied by prediction-error-related ERP components (Gehrke et al., 2019). In a follow-up study, they showed that this ERP effect can actually be used as input for a classifier that detects trials with visuo-haptic mismatches solely on the basis of participants' EEG data (Gehrke et al., 2022). Their ERP-based classification reached an accuracy of 77%, whereas classification based on behavioral data barely exceeded chance level (55%).…”
Section: Discussion
confidence: 99%
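The quoted result (a single-trial ERP feature outperforming behavior) can be illustrated with a toy sketch. This is not the authors' pipeline: the post-feedback window, the effect size, and the minimum-distance classifier are all illustrative assumptions, operating on one simulated mean-amplitude feature per trial.

```python
import random

random.seed(7)

# Hypothetical sketch (not the published pipeline): detect visuo-haptic
# mismatch trials from a prediction-error ERP feature. We simulate the mean
# amplitude in a post-feedback window; mismatch trials carry an extra
# negative deflection, match trials do not.
def simulate_trials(n, mismatch, noise=2.0):
    base = -3.0 if mismatch else 0.0  # microvolts; illustrative effect size
    return [random.gauss(base, noise) for _ in range(n)]

train_match = simulate_trials(100, mismatch=False)
train_mismatch = simulate_trials(100, mismatch=True)

# Minimum-distance (nearest class mean) classifier on the 1-D feature.
mu_match = sum(train_match) / len(train_match)
mu_mismatch = sum(train_mismatch) / len(train_mismatch)

def predict(amplitude):
    if abs(amplitude - mu_mismatch) < abs(amplitude - mu_match):
        return "mismatch"
    return "match"

test = [(a, "match") for a in simulate_trials(200, False)] + \
       [(a, "mismatch") for a in simulate_trials(200, True)]
accuracy = sum(predict(a) == y for a, y in test) / len(test)
print(f"single-feature accuracy: {accuracy:.2f}")
```

With this illustrative effect size the toy classifier lands well above the 50% chance level, mirroring the qualitative pattern reported in the quote.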
“…Because the assessment rests on a comparison of ERPs elicited by specific self- versus externally generated events precisely locked in time, the method is robust against unspecific variations in user state. Indeed, prediction-related ERPs have already been suggested as a quality metric for VR interactions (Gehrke et al., 2019, 2022). These authors introduced temporal mismatches between different sensory channels, specifically between visual and haptic stimuli elicited by a user action.…”
Section: Introduction
confidence: 99%
“…Motion events can be an integral analysis aspect of MoBI experiments. For example, they can be used to detect reaction time and movement duration (Gehrke et al., 2022; Jungnickel & Gramann, 2016), they can serve as anchors for time-warping in spectral analysis (Gramann et al., 2021), they can help remove oscillatory gait artifacts (Gwin et al., 2010), or they can help to shed light on the neural basis of oscillatory gait generation (Wagner et al., 2016). To enable this functionality, the pipeline contains two functions: a basic movement onset and offset detector that requires only a single tracked element of any kind, and the advanced detection of relevant gait event markers.…”
Section: Data Cleaning and Processing
confidence: 99%
“…The detector assumes no trend in the data and thus works on data where the endpoint of a movement is the same as its start point in the relevant channels. This can be, for example, the yaw orientation of the head for detecting rotation movements (Gramann et al., 2021), the position of the hand in a reaching task (Gehrke et al., 2022), or the up/down movement of a foot tracker for detecting steps (see next section). The detected event markers and the parameters used are stored in the data structure so that they can be copied between synchronized datasets of different modalities.…”
Section: Basic Motion Event Detection
confidence: 99%
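The single-channel onset/offset detection described in this quote can be sketched as a simple velocity-threshold pass. The function name, threshold, and toy signal below are illustrative assumptions, not the pipeline's actual API; like the quoted detector, the sketch assumes detrended data in which the movement returns to its start point.

```python
# Hedged sketch of a single-channel movement onset/offset detector of the kind
# described above (names and thresholds are illustrative, not the pipeline's API).
def detect_movement(signal, fs, threshold=0.05):
    """Return (onset_idx, offset_idx) spanning the first to the last crossing
    of the absolute velocity threshold, or None if no movement is found."""
    # Finite-difference velocity in signal units per second.
    vel = [(b - a) * fs for a, b in zip(signal, signal[1:])]
    moving = [i for i, v in enumerate(vel) if abs(v) > threshold]
    if not moving:
        return None
    return moving[0], moving[-1] + 1

# Toy reach-and-return trace: flat, ramp out, hold, ramp back, flat,
# so the endpoint equals the start point, as the detector assumes.
fs = 100  # Hz
pos = ([0.0] * 50
       + [i / 100 for i in range(100)]        # reach out
       + [1.0] * 30                           # hold
       + [1.0 - i / 100 for i in range(100)]  # return
       + [0.0] * 50)
onset, offset = detect_movement(pos, fs)
print(onset, offset)
```

A production detector would typically add smoothing and minimum-duration constraints before thresholding; the sketch keeps only the core idea of locating the first and last supra-threshold velocity samples.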