2023
DOI: 10.1075/is.22010.sca
From vocal prosody to movement prosody, from HRI to understanding humans

Philip Scales,
Véronique Aubergé,
Olivier Aycard

Abstract: Human–Human and Human–Robot Interaction are known to be influenced by a variety of modalities and parameters. Nevertheless, it remains a challenge to anticipate how a given mobile robot’s navigation and appearance will impact how it is perceived by humans. Drawing a parallel with vocal prosody, we introduce the notion of movement prosody, which encompasses spatio-temporal and appearance dimensions which are involved in a person’s perceptual experience of interacting with a mobile …

Cited by 2 publications
(5 citation statements)
References 56 publications
“…Our model of human perception was derived from the analysis of experimental data from our online study and in-person studies presented in our prior work [30]. The results show that accelerations, velocities, and timing have significant impacts on the social perception of our mobile robot.…”
Section: Discussion
confidence: 99%
“…The scales were chosen based on words that participants in prior HRI studies had used to self-annotate their own recorded interaction data after a long experiment with a small butler robot [26,27]. For more details about the choice of the adjectives in our scales, we refer the reader to [30]. During the experiment, each participant rated 45 different videos, each showing a unique combination of the corpus variable values, along the 10 binary perceptual scales.…”
Section: Perceptual Scales
confidence: 99%