2019 International Conference on Robotics and Automation (ICRA)
DOI: 10.1109/icra.2019.8793858
Who Takes What: Using RGB-D Camera and Inertial Sensor for Unmanned Monitor

Cited by 2 publications (1 citation statement)
References 15 publications
“…Reference [20] uses a depth camera to extract skeletons and pairs them with the Inertial Measurement Unit (IMU) devices carried by users. Fusion-based human and object tracking is shown in [21]. In [22], a camera captures human motion via OpenPose [3] and matches it with an IMU device to address ID association.…”
Section: Related Work
confidence: 99%
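The citing passage describes pairing camera-tracked skeletons with worn IMU devices by matching their motion to resolve which tracked body belongs to which device (ID association). Below is a minimal sketch of one way such correlation-based association can be done, not the implementation of the paper above: the wrist-joint choice, the crude gravity removal, the normalized-correlation score, and the Hungarian assignment are all illustrative assumptions, and the signals are assumed to be pre-synchronized and equal length.

```python
# Minimal sketch of camera-IMU ID association by motion matching.
# Assumptions (not from the cited paper): wrist joint as the motion source,
# correlation of acceleration magnitudes as the similarity score,
# Hungarian assignment for the final pairing.
import numpy as np
from scipy.optimize import linear_sum_assignment

FPS = 30.0  # assumed common sampling rate after resampling camera and IMU streams


def joint_accel_magnitude(joint_xyz):
    """Acceleration magnitude of one skeleton joint from its (T, 3) trajectory in metres."""
    vel = np.gradient(joint_xyz, 1.0 / FPS, axis=0)   # finite-difference velocity
    acc = np.gradient(vel, 1.0 / FPS, axis=0)         # finite-difference acceleration
    return np.linalg.norm(acc, axis=1)                # (T,) magnitude signal


def imu_accel_magnitude(imu_xyz):
    """Roughly gravity-free acceleration magnitude from raw (T, 3) accelerometer samples."""
    mag = np.linalg.norm(imu_xyz, axis=1)
    return np.abs(mag - 9.81)                         # crude gravity removal for illustration


def normalized_xcorr(a, b):
    """Zero-mean, unit-variance correlation of two equally long 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.dot(a, b) / len(a))


def associate_ids(skeleton_wrists, imu_streams):
    """Pair each skeleton with the IMU whose motion it matches best.

    skeleton_wrists: list of (T, 3) wrist trajectories, one per tracked person.
    imu_streams:     list of (T, 3) accelerometer streams, one per worn device.
    Returns a list of (skeleton_index, imu_index) pairs.
    """
    cam_sigs = [joint_accel_magnitude(w) for w in skeleton_wrists]
    imu_sigs = [imu_accel_magnitude(s) for s in imu_streams]
    score = np.array([[normalized_xcorr(c, i) for i in imu_sigs] for c in cam_sigs])
    rows, cols = linear_sum_assignment(-score)        # maximize total correlation
    return list(zip(rows.tolist(), cols.tolist()))
```

In practice the published fusion approaches additionally handle clock offset, differing sampling rates, and tracking dropouts; this sketch only shows the core idea of scoring camera-derived motion against each IMU and solving the resulting assignment problem.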