2015 3rd RSI International Conference on Robotics and Mechatronics (ICROM)
DOI: 10.1109/icrom.2015.7367878
Skeleton and visual tracking fusion for human following task of service robots

Cited by 11 publications (4 citation statements). References 13 publications.
“…This work employs the NiTE SDK to locate the hand center point [39]. Figure 7 shows the hand center point (the red point in Figure 7) estimated by the NiTE SDK.…”
Section: The Developed Smart Material-handling Robot System With HRI
Confidence: 99%
“…To quickly find the hand point location (if one exists) in the binary-valued image with the NiTE SDK, an initial procedure obtains regular motion vector information between consecutive images (similar to eye detection using variants of motion vectors). The NiTE SDK defines three dynamic hand gesture actions that produce such motion vector variants [39], namely “hand-waving,” “hand-clicking,” and “hand-raising.” In this work, the “hand-clicking” gesture, meaning “pushing forward and then retracting,” is used. Once the hand point is detected, hand tracking follows; keeping the hand “pushing forward” while performing the specific HRI hand gesture then helps segment the significant hand part.…”
Section: The Developed Smart Material-handling Robot System With HRI
Confidence: 99%
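The "push forward, then retract" cue described above can be sketched as a simple check on a hand point's depth history. This is only an illustrative approximation of such a detector — the helper name, window length, and millimeter thresholds are assumptions, and the actual gesture recognition is performed internally by the NiTE SDK:

```python
# Hypothetical sketch: detect a "hand-clicking" gesture (push toward the
# sensor, then retract) from recent depth readings of the tracked hand
# point, in millimeters. Thresholds are illustrative, not from NiTE.

def is_hand_click(depths, push_mm=80, window=30):
    """Return True if the depth dips (hand pushed forward) and recovers."""
    depths = depths[-window:]          # only consider the recent history
    if len(depths) < 3:
        return False
    start, end = depths[0], depths[-1]
    nearest = min(depths)              # closest approach to the sensor
    pushed = start - nearest >= push_mm            # moved toward the camera
    retracted = end - nearest >= 0.5 * push_mm     # came most of the way back
    return pushed and retracted
```

A steady hand produces no dip and is rejected, while a push-and-retract trace (e.g. 1000 → 880 → 980 mm) is accepted.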
See 1 more Smart Citation
“…[21] combined an omnidirectional camera and a laser to obtain robust tracking in an outdoor environment. In recent years, combined RGB image and depth sensing has enabled skeleton tracking, which can be used together with other visual detectors and filters for human tracking [4]. In the case of occlusion, an Extended Kalman Filter (EKF) based method has been used.…”
Section: Person-Following Robot
Confidence: 99%
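The EKF-based occlusion handling mentioned above can be sketched as a constant-velocity filter that coasts on prediction alone while the person is hidden. This is a minimal illustration, not the cited method: the class name, noise values, and the range/bearing measurement model (as from a laser scanner) are all assumptions:

```python
import numpy as np

class PersonEKF:
    """Hypothetical constant-velocity EKF for planar person tracking
    from range-bearing measurements; coasts on predict() when occluded."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(4)                # state: [px, py, vx, vy]
        self.P = np.eye(4)                  # state covariance
        self.F = np.eye(4)                  # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.Q = 0.01 * np.eye(4)           # process noise (illustrative)
        self.R = np.diag([0.05, 0.01])      # range/bearing noise (illustrative)

    def predict(self):
        # During occlusion only this step runs, extrapolating the track.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z):
        # Nonlinear measurement h(x) = [range, bearing], linearized via H.
        px, py = self.x[0], self.x[1]
        r = np.hypot(px, py)
        h = np.array([r, np.arctan2(py, px)])
        H = np.array([[px / r,      py / r,     0.0, 0.0],
                      [-py / r**2,  px / r**2,  0.0, 0.0]])
        y = z - h
        y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi   # wrap bearing residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ H) @ self.P
```

When the detector loses the person, calling only `predict()` keeps the track alive and its covariance growing, so the estimate can be re-associated once the person reappears.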