2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
DOI: 10.1109/hri.2013.6483584
Robot-human hand-overs in non-anthropomorphic robots

Cited by 4 publications (2 citation statements); references 2 publications.
“…By using a non-human-like object as control agent and by focusing on gaze behavior, our findings expand existing literature that demonstrated the importance of human-like and robot-like visual textures during the observation and prediction of humanoid robots' mechanical manual actions (Saygin et al., 2012; Saygin and Stadler, 2012; Urgen et al., 2018). Further work is now required to examine how implicit signals (e.g., gaze movements) of human-like and non-human-like robots (Micelli et al., 2011; Pan et al., 2018; Sivakumar et al., 2013) facilitate the predictability of artificial agents to positively improve the fluency and subjective experience of human–robot interactions (Ortenzi et al., 2021).…”
Section: Discussion (citation type: mentioning; confidence: 99%)
“…Human-robot handovers offer a rich design space with a large number of parameters. Research to date has explored the role of gaze [21], approach angle and saliency [34], contrast between start and end points of handover motion [7], and anthropomorphism [31]. Moreover, prior work developed methods to choose handover parameters such as pose and trajectory that consider user preferences [6], user comfort [2], object affordances [3,9], and user mobility constraints [19] that facilitate human-robot handovers.…”
Section: B. Designing Handovers for HRI (citation type: mentioning; confidence: 99%)