2021
DOI: 10.31234/osf.io/te8rb
Preprint

Belief in sharing the same phenomenological experience increases the likelihood of adopting the intentional stance towards a humanoid robot

Abstract: Humans interpret and predict others’ behaviors by ascribing them intentions or beliefs, or in other words, by adopting the intentional stance. Since artificial agents are increasingly populating our daily environments, the question arises whether (and under which conditions) humans would apply the “human-model” to understand the behaviors of these new social agents. Thus, in a series of three experiments we tested whether embedding humans in a social interaction with a humanoid robot either displaying a human-…

Cited by 7 publications (6 citation statements)
References 43 publications (51 reference statements)
“…These results extend evidence that sharing a task with a robotic partner can influence the way we perceive it, by changing our likelihood of adopting the intentional stance toward it [20] [21] [22]. Importantly, while in our task participants and the robot had a shared goal (detection of the target) there was no reward (for example earning points, as in [21]) associated with achieving the goal.…”
Section: Discussion (supporting)
confidence: 76%
“…Previous studies on task sharing with artificial agents show that the perceived intentionality of an artificial co-agent can influence the extent to which joint representations are formed with the co-agent during the interaction [10] [5] [6], which successively may affect performance in the task. Furthermore, previous results have also shown that the likelihood of attributing intentionality to a humanoid robot can be influenced through interaction with the robot [21] [20] [22].…”
Section: Discussion (mentioning)
confidence: 99%
“…Concerning the study at hand, it was a challenge to account for various individual differences due to the limited sample size as individual difference study require large sample sizes [17]. It is also equally important to mention that we cannot state that this effect can be generalized to other settings due to the fact that robot features appearance [18], [47], robot behavior [15], [17], social/nonsocial context [10], [18] all influence how we engage with a robot. Thus, the interaction between these variables needs to be kept in mind when designing these robots.…”
Section: Discussion (mentioning)
confidence: 99%
“…Robots, on the other hand, were evaluated with low scores on the Body and Heart subscales, but higher scores on Mind. More importantly for social robotics, researchers suggest that this baseline perception of mind/mental capacities/intentionality can be altered by manipulating three categories of factors including (a) robot features [15]- [17] (b) the context of the HRI [10], [18] and (c) individual differences [9], [12], [13].…”
Section: Introduction (mentioning)
confidence: 99%