2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
DOI: 10.1109/ro-man47096.2020.9223542

Multi-camera Torso Pose Estimation using Graph Neural Networks

Abstract: Estimating the location and orientation of humans is an essential skill for service and assistive robots. To achieve a reliable estimation in a wide area such as an apartment, multiple RGBD cameras are frequently used. Firstly, these setups are relatively expensive. Secondly, they seldom perform an effective data fusion using the multiple camera sources at an early stage of the processing pipeline. Occlusions and partial views make this second point very relevant in these scenarios. The proposal presented in t…

Cited by 3 publications (1 citation statement)
References 21 publications
“…Based on the assumption that in real life scenarios we can build on top of third party body trackers (e.g., [31,34]) and path planning systems, we proceed with the evaluation of the approach against the dataset presented in Section 3. Because all nodes are connected to their corresponding room node, the GNNs were trained to perform backpropagation based on the feature vector of the room node in the last layer.…”
Section: Results
confidence: 99%
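The citation statement above describes a GNN in which every node is connected to a shared room node, and the training loss is computed from the room node's feature vector in the last layer. As a rough illustration of that readout pattern (not the cited authors' implementation), the following NumPy sketch builds a star-shaped toy graph, applies one GCN-style message-passing layer, and computes a loss only on the room node; all sizes and the zero target are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Star-shaped toy graph: node 0 is the "room" node and every other
# node is connected to it; self-loops are added on the diagonal.
n_nodes, in_dim, out_dim = 5, 4, 3
A = np.eye(n_nodes)
A[0, 1:] = 1.0
A[1:, 0] = 1.0

# Symmetric degree normalization, as in a GCN-style layer.
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))

H = rng.standard_normal((n_nodes, in_dim))   # input node features
W = rng.standard_normal((in_dim, out_dim))   # layer weights

# One message-passing layer with ReLU; deeper stacks repeat this step.
H_out = np.maximum(A_hat @ H @ W, 0.0)

# Readout: the loss depends only on the room node's final feature
# vector, so gradients would backpropagate through the whole graph.
target = np.zeros(out_dim)                   # hypothetical target
loss = float(np.mean((H_out[0] - target) ** 2))
```

Because the room node aggregates messages from every other node, supervising only its last-layer features is enough to train the whole network, which matches the training setup the citing work describes.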