2021
DOI: 10.3389/frobt.2021.642796

Mind the Eyes: Artificial Agents’ Eye Movements Modulate Attentional Engagement and Anthropomorphic Attribution

Abstract: Artificial agents are on their way to interacting with us daily. The design of embodied artificial agents that can cooperate easily with humans is therefore crucial for their deployment in social scenarios. Endowing artificial agents with human-like behavior may boost individuals' engagement during the interaction. We tested this hypothesis in two screen-based experiments. In the first one, we compared the attentional engagement displayed by participants while they observed the same set of behaviors displayed by an avatar […]

Cited by 8 publications (8 citation statements)
References 47 publications
“…Perugia, Paetzel-Prüsmann, Alanenpää, and Castellano (2021) compared users' total fixation time toward human-like, mechanical, or morph robot faces with mutual gaze and found that users' total fixation time could predict engagement in voice conversations. Users' visual attention toward a human gaze versus a robot gaze with different parameters has also been explored through video watching (Ghiglino et al., 2020, 2021).…”
Section: Robot Gaze (mentioning)
confidence: 99%
“…Additionally, these earlier studies chiefly measured users' subjective perception (Babel et al., 2021; Morillo-Mendez et al., 2021; Mutlu et al., 2012) or combined it with either visual attention (Ghiglino et al., 2020, 2021; Thepsoonthorn et al., 2021) or cerebral activity (Belkaid et al., 2021; Kompatsiari, Bossi, & Wykowska, 2021). Kelley et al. (2021) measured eye movements alongside brain signals and reported the proportion of time spent looking at robot eyes together with cerebral activity.…”
Section: Robot Gaze (mentioning)
confidence: 99%
“…before any task is given to participants). Furthermore, Ghiglino and colleagues recently showed that subtle differences in robot behavior might influence individuals' tendency to adopt the intentional stance (Ghiglino et al., 2020) and that including human-like behaviors in the robot can facilitate communication in interactive human-robot scenarios (Ghiglino et al., 2021).…”
Section: Introduction (mentioning)
confidence: 99%
“…This form factor allows clients to have more natural face-to-face conversations at any time, combining the features of human interaction with the benefits of agent interaction. Furthermore, including human-like behaviors such as eye movements has the potential to improve users' communication with agents [170].…”
Section: Embodiment (mentioning)
confidence: 99%