2021
DOI: 10.3389/fpsyg.2021.673982
Temporal Behavioral Parameters of On-Going Gaze Encounters in a Virtual Environment

Abstract: To navigate the social world, humans rely heavily on gaze for non-verbal communication, as it conveys information in a highly dynamic and complex, yet concise manner: for instance, humans effortlessly use gaze to direct and infer the attention of a possible interaction partner. Many traditional paradigms in social gaze research, however, rely on static ways of assessing gaze interaction, e.g., by using images or prerecorded videos as stimulus material. Emerging gaze-contingent paradigms, in which algorithmical…

Cited by 8 publications (8 citation statements). References 68 publications (88 reference statements).
“…Its validity, however, depends on the expertise and experience of the observer (Kamp‐Becker et al., 2018). Future developments should incorporate quantifiable indices of behavior, for example, eye gaze (Chong et al., 2019; Hartz, Guth, Jording, Vogeley, & Schulte‐Rüther, 2021), facial expression (Drimalla et al., 2020), motion (Budman et al., 2019), or even neural assessment during social interaction (Kruppa et al., 2020). A holistic approach is necessary (Roessner et al., 2021), including intermediate steps: Future studies could aim to set up generative models that describe how quantifiable behavioral indices translate into clinician symptom ratings, and in a second step, relate these to the diagnostic classification.…”
Section: Discussion
confidence: 99%
“…On each block, the face of a human avatar was continuously displayed on the screen. We used DAZ Studio 4.9 (DAZ Productions, Inc., USA) to create a modified version of the stimuli validated by Hartz et al. (2021). Eleven pictures of the avatar with different levels of smiling were created for the visualization of the feedback signal.…”
Section: Methods
confidence: 99%
“…On each block, the face of a human avatar was continuously displayed on the screen. We used DAZ Studio 4.9 (DAZ Productions, Inc., USA) to create a modified version of the stimuli validated by Hartz et al. (2021).…”
Section: Neurofeedback Training
confidence: 99%
“…More quantifiable indices of behavior derived from an ADOS-like examination would be desirable to allow for a more precise and observer-independent assessment. Future developments should incorporate quantifiable assessments of eye gaze (Chong et al., 2019; Hartz et al., 2021), facial expression (Drimalla et al., 2020), motion (Budman et al., 2019), or even neural assessment during social interaction (Kruppa et al., 2020). However, it is unlikely that machine learning approaches will succeed in directly relating the basic building blocks of interactive social behavior to a clinical diagnosis of a heterogeneous disorder such as ASD.…”
Section: Future Directions
confidence: 99%