2019 IEEE 2nd International Conference on Information Communication and Signal Processing (ICICSP)
DOI: 10.1109/icicsp48821.2019.8958548
SET: Stereo Evaluation Toolbox for Combined Performance Assessment of Camera Systems, 3D Reconstruction and Visual SLAM

Cited by 4 publications (3 citation statements)
References 12 publications
“…Most recent sensor benchmarks evaluated low-cost consumer sensors and/or LiDARs, such as an actuated SICK LMS-200 (ground truth), a SwissRanger SR-4000, a Fotonic B70 and a Microsoft Kinect camera [8], a Hokuyo URG-04LX, Hokuyo UTM-30LX and SICK LMS-151 [9], a Microsoft Kinect, ZESS MultiCam 3k and 19k, SoftKinetic DepthSense 311, PMD Technologies 3k-S and CamCube 41k [10], Kinect and Asus Xtion [13], Kinect II [16], [17], Microsoft Kinect II and Asus Xtion Pro [14], Intel RealSense SR300 [11], Intel RealSense D415 [12], and a comparison of ten different LiDARs for applications in autonomous driving [18]. One exception is the SET framework for the evaluation of stereo matching, used to test a Stereolabs ZED and a Roboception rc_visard 160 [19]. Unfortunately, it is not applicable in our benchmark, as most of the included sensors obtain 3D information via structured light (SL) or time of flight (ToF).…”
Section: Related Work
Confidence: 99%
“…Also, a qualitative visual assessment of label smoothness is possible. A score from 0 to 10 allows a detailed rating by experts with domain knowledge [12]. Here, 10 stands for the highest possible ACC.…”
Section: Accuracy (ACC) of Processed Data
Confidence: 99%
“…On the other hand, Halmetschlager-Funek et al. [12] analyzed the precision, bias and lateral noise of sensors under different lighting conditions when observing objects of different materials. Heide et al. [13] evaluated stereo cameras, using as metrics the point-cloud density, the smoothness of the surface points captured from the walls, the consistency of the edges, and the mean distance between the 3D points and the ground-truth surfaces, along with a comparison of their surface normals. For simultaneous localization and mapping (SLAM) use cases, Neto et al. [14] relied on the mean distance between the poses estimated by the SLAM algorithm and the ground-truth poses, which were known because the sensors were mounted on a robotic arm following a pre-programmed trajectory.…”
Section: Introduction
Confidence: 99%
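The pose-error metric attributed to Neto et al. [14] above (mean distance between SLAM-estimated poses and ground-truth poses) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the toy trajectories are assumptions, and only the translational part of each pose is compared.

```python
import math

def mean_translation_error(estimated, ground_truth):
    """Mean Euclidean distance between paired estimated and
    ground-truth 3D positions (a simple ATE-style metric).

    Both arguments are equal-length sequences of (x, y, z) tuples,
    assumed to be already time-aligned and in the same frame.
    """
    if len(estimated) != len(ground_truth):
        raise ValueError("trajectories must have the same length")
    total = 0.0
    for (ex, ey, ez), (gx, gy, gz) in zip(estimated, ground_truth):
        total += math.sqrt((ex - gx) ** 2 + (ey - gy) ** 2 + (ez - gz) ** 2)
    return total / len(estimated)

# Toy example: a three-pose estimated trajectory vs. the known
# robotic-arm trajectory (illustrative values only).
est = [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0), (2.0, 0.1, 0.0)]
gt = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(mean_translation_error(est, gt))
```

In practice the two trajectories must first be associated by timestamp and rigidly aligned (e.g. with a least-squares fit) before computing this mean; the sketch assumes that has already been done.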