Proceedings of the International Conference on Artificial Intelligence and Robotics and the International Conference on Automation, Control and Robotics Engineering, 2016
DOI: 10.1145/2952744.2952748

A system for an anticipative front human following robot

Cited by 10 publications (10 citation statements).
References 8 publications.

“…In particular, various aspects such as desired proximity behaviors, following angles (Shanee et al, 2016), turning and waiting behavior, etc., should be considered during trajectory planning. Additionally, application-specific choices, such as whether to stay behind or side-by-side while following, the desired speed, and relevant anticipative behaviors (Granata and Bidaud, 2012; Mi et al, 2016) are essential considerations. Another important feature is to maintain the expected behavior during explicit interactions (Hu et al, 2014; Islam et al, 2018c), e.g., being stationary when the human is communicating, and exhibiting correct acknowledgement responses.…”
Section: State-of-the-art Approaches (mentioning; confidence: 99%)
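
The following-angle and proximity choices discussed in this excerpt can be illustrated with a small planar-geometry sketch. This is an assumption-laden illustration, not the method of the cited works; the distance, angle, and function names are placeholders.

import math

# Illustrative sketch (assumption, not the cited papers' method): place the
# goal pose at a chosen distance and angular offset from the person's heading.
def following_goal(person_x, person_y, person_heading,
                   follow_distance=1.2, follow_angle=0.0):
    # follow_angle is measured from the person's heading:
    # 0 = directly in front (front following), pi = behind, +/- pi/2 = side by side.
    gx = person_x + follow_distance * math.cos(person_heading + follow_angle)
    gy = person_y + follow_distance * math.sin(person_heading + follow_angle)
    # Lead the person: keep the robot oriented along their walking direction.
    return gx, gy, person_heading

# Person at the origin walking along +x: the goal sits 1.2 m ahead of them.
print(following_goal(0.0, 0.0, 0.0))  # (1.2, 0.0, 0.0)
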
“…Unlike in an underwater scenario, the person cannot come close to the robot and perform hand gestures in front of its camera. Consequently, the UAV might end up being too far away to detect various kinds of hand gestures (Bruce et al, 2016; Monajjemi et al, 2016). In such cases, it is often useful to use a reliable gesture (a static palm gesture, waving hands, etc.)…”
Section: State-of-the-art Approaches (mentioning; confidence: 99%)
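
One way to make the range limitation in this excerpt concrete is to gate the usable gesture vocabulary on the estimated robot-to-person distance. The thresholds and gesture names below are hypothetical; they are not taken from Bruce et al. (2016) or Monajjemi et al. (2016).

# Hypothetical sketch: choose which gestures a recognizer can plausibly rely
# on at a given distance; fine hand gestures only work at close range.
def usable_gestures(distance_m, fine_range_m=2.0, coarse_range_m=8.0):
    if distance_m <= fine_range_m:
        return {"thumbs_up", "pointing", "stop_palm"}   # fine hand gestures
    if distance_m <= coarse_range_m:
        return {"stop_palm", "wave_both_arms"}          # large, reliable gestures
    return set()                                        # too far: fall back to other cues

print(usable_gestures(5.5))  # {'stop_palm', 'wave_both_arms'}
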
“…Several works [15][16][17][18][19][20] use the first approach to approximate the distance information by triangulation methods applied on two or more RGB views of the same scene. However, the most used visual sensors for person detection are RGB-D cameras [21][22][23][24][25][26][27][28][29][30][31][32] that are able to get both RGB images and depth maps by exploiting infrared light. Several methods employ sensor fusion techniques to merge information from different kinds of sensing systems.…”
Section: Person Following (mentioning; confidence: 99%)
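
For a rectified stereo pair, the triangulation mentioned in this excerpt reduces to the standard relation Z = f * B / d (depth from focal length, camera baseline, and pixel disparity). A minimal sketch with illustrative numbers, not values from the cited systems:

# Depth of a point seen in two rectified RGB views of the same scene.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A person whose torso shifts 40 px between cameras 0.12 m apart,
# imaged with a 700 px focal length, is about 2.1 m away.
print(depth_from_disparity(40.0, 700.0, 0.12))  # 2.1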