2018
DOI: 10.1007/978-3-030-01252-6_36

A Framework for Evaluating 6-DOF Object Trackers

Abstract: We present a challenging and realistic novel dataset for evaluating 6-DOF object tracking algorithms. Existing datasets show serious limitations — notably, unrealistic synthetic data, or real data with large fiducial markers — preventing the community from obtaining an accurate picture of the state-of-the-art. Using a data acquisition pipeline based on a commercial motion capture system for acquiring accurate ground truth poses of real objects with respect to a Kinect V2 camera, we build a dataset which contains a…

Cited by 29 publications (76 citation statements)
References 23 publications
“…, 1000. In contrast to the previous experiment, here we use a different, more generic criterion to measure the tracking error as suggested in [10], since (39) highly depends on the object's geometry. Starting at $T^j(t_0) = T^j_{gt}(t_0)$, for each subsequent frame we thus compute the tracking error separately for translation $e^j_k(\mathbf{t}) = \|\mathbf{t}^j(t_k) - \mathbf{t}^j_{gt}(t_k)\|_2$ and rotation $e^j_k(R) = \cos^{-1}(\mathrm{trace}(R^j(t_k)\,R^j\ldots$”
Section: Results
confidence: 99%
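The snippet above truncates the rotation-error formula. A minimal sketch of these per-frame error metrics is shown below, assuming the standard geodesic form $\cos^{-1}((\mathrm{trace}(R_{gt}^{\top}R)-1)/2)$ for the rotation error; the function names and the completed rotation formula are assumptions, not taken from the quoted text.

```python
import numpy as np

def translation_error(t, t_gt):
    """L2 translation error between an estimated and a ground-truth position."""
    return np.linalg.norm(np.asarray(t, dtype=float) - np.asarray(t_gt, dtype=float))

def rotation_error(R, R_gt):
    """Geodesic rotation error in radians between two rotation matrices.

    Assumes the standard form acos((trace(R_gt^T R) - 1) / 2); the clip guards
    against arccos domain errors from floating-point round-off.
    """
    cos_theta = (np.trace(R_gt.T @ R) - 1.0) / 2.0
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

# Example: translation off by 3 units along z, rotation off by 90 degrees about z.
R_gt = np.eye(3)
R_est = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
print(translation_error([1.0, 2.0, 3.0], [1.0, 2.0, 0.0]))  # 3.0
print(rotation_error(R_est, R_gt))  # ~1.5708 (pi/2)
```

Tracking is typically reinitialized at the ground-truth pose at $t_0$, so both errors measure drift accumulated over subsequent frames.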
“…Approaches using only synthetic training data, like Tan et al [60] or Garon et al [61], show remarkable accuracies in their field. However, they rely on RGB-D images providing depth information, or use model data during runtime.…”
Section: Approaches Using Only Synthetic Data
confidence: 99%
“…A motion capturing system by Vicon (VMCS), V16 (Vicon Motion Systems, LA, USA), with eight motion capturing cameras mounted on the ceiling was used to monitor the position and movement of the 6DTC and the robot couch, with a set of six reflective position/motion markers (spherical, 14 mm in diameter) attached to the 6DTC or to the robot couch. After calibration, the motion capturing system is capable of detecting deformation and displacement with accuracy better than 0.1° and 0.1 mm (15–17).…”
Section: Coordinate System and Position/Motion Monitoring and Measurement
confidence: 99%