2021 IEEE International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra48506.2021.9560755
Automating Behavior Selection for Affective Telepresence Robot

Cited by 6 publications (1 citation statement)
References: 23 publications
“…- Body pose: skeleton detection [16] is used to obtain the poses of the different persons' parts (limbs, body, head) in the field of view.
- Gestures: roughly 50 bodily movements can be recognized, such as waving, pointing, teasing, scratching one's head, asking Haru to be quiet, jumping, laughing, face covering, bowing, clapping, walking, standing, sitting down, and standing up, among others [21].
- Face features: faces in the database can be identified.…”
Section: Perception
confidence: 99%
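As a rough illustration of the perception outputs the excerpt lists (body pose keypoints, a recognized gesture, and a face identity), a minimal Python sketch is given below. The class and function names are hypothetical and are not taken from the cited paper or its code; they only show how such per-person percepts might be bundled and handed to a downstream behavior-selection step.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

# Hypothetical container for the per-person percepts the excerpt describes:
# skeleton keypoints, a recognized gesture label, and an optional face identity.
@dataclass
class PersonPercept:
    person_id: int                             # tracker-assigned index in the field of view
    keypoints: Dict[str, Tuple[float, float]]  # e.g. {"head": (x, y), "left_wrist": (x, y)}
    gesture: Optional[str] = None              # one of ~50 labels, e.g. "waving", "bowing"
    face_identity: Optional[str] = None        # name if the face matches the database, else None

def select_attention_target(percepts: List[PersonPercept]) -> Optional[PersonPercept]:
    """Toy selection rule: prefer a known person who is gesturing, then any gesturing person."""
    known_and_gesturing = [p for p in percepts if p.face_identity and p.gesture]
    if known_and_gesturing:
        return known_and_gesturing[0]
    gesturing = [p for p in percepts if p.gesture]
    return gesturing[0] if gesturing else None

if __name__ == "__main__":
    people = [
        PersonPercept(0, {"head": (0.4, 0.2)}, gesture="waving", face_identity="Alice"),
        PersonPercept(1, {"head": (0.7, 0.3)}),
    ]
    target = select_attention_target(people)
    print(target.person_id if target else "no target")
```

The selection rule above is deliberately simplistic; it stands in for whatever behavior-selection policy consumes these percepts and is not a description of the paper's method.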