2012
DOI: 10.1109/jstsp.2012.2207090
Real-Time Distance-Dependent Mapping for a Hybrid ToF Multi-Camera Rig

Abstract: We propose a real-time mapping procedure for data matching to deal with hybrid ToF multi-camera rig data fusion. Our approach takes advantage of the depth information provided by the ToF camera to calculate the distance-dependent disparity between the two cameras that constitute the system. As a consequence, the non-co-centric binocular system behaves as a co-centric system with collinear optical axes between its sensors. The association between mapped and non-mapped image coordinates can be describ…
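The geometry behind this is the inverse relation between disparity and depth for a rectified camera pair. Below is a minimal sketch of that relation, assuming a pinhole model with focal length f in pixels and baseline b in metres; the parameter names and values are hypothetical, and the paper's full calibration model is not reproduced:

```python
import numpy as np

def distance_dependent_disparity(depth_m, focal_px, baseline_m):
    """Disparity in pixels induced by depth Z: d = f * b / Z.

    Hypothetical parameter names; the paper's actual calibration model
    (lens distortion, non-collinear optical axes) is not reproduced here.
    """
    depth_m = np.asarray(depth_m, dtype=np.float64)
    return focal_px * baseline_m / np.maximum(depth_m, 1e-6)

# Example: with f = 520 px and b = 4 cm, a point at 1.5 m shifts by
# about 13.9 px between the two sensors; the shift grows as Z shrinks.
print(distance_dependent_disparity(1.5, focal_px=520.0, baseline_m=0.04))
```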

Cited by 15 publications (5 citation statements) · References 21 publications · Citing publications span 2013–2023
“…In addition to being consumer-accessible, a major advantage of the aforementioned depth cameras is that they are closed sensing systems that provide real-time depth and video streams with a well-aligned mapping between the integrated sensors. Therefore, we do not have to deal with the internal transformation, in which a distance-dependent disparity is involved [16], to map the depth data onto the 2D data. In the following, we first motivate the use of a colour guidance image by a quantitative evaluation in which we have considered scenes from the Middlebury stereo dataset [Middlebury Stereo Dataset, http://vision.middlebury.edu/stereo] as well as the two test cases that we have previously generated to illustrate the downside of transforming a colour guidance image to its greyscale version.…”
Section: Results
confidence: 99%
“…It is based on the assumption that the output has an edge only if the guidance image has one. Garcia et al. [14] reduced the weight of unreliable depth samples by analyzing the gradient of LR depth values. Min et al. [15] proposed a weighted mode filtering method (WMF) based on a joint histogram of the depth candidates, which constrains the result to satisfy an L1-norm minimization criterion.…”
Section: Filter-based Methods
confidence: 99%
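As a rough illustration of the weighted-mode idea attributed to Min et al. above, here is a minimal sketch assuming a single-channel float guidance image and a uniform depth quantisation; the window radius, σ, and bin count are illustrative choices, and details of the original method (its exact weighting and refinement) are omitted:

```python
import numpy as np

def weighted_mode_filter(depth, guide, radius=3, sigma_g=10.0, bins=64):
    """Sketch of weighted mode filtering: per pixel, build a histogram
    of neighbouring depth values weighted by guidance-image similarity,
    then keep the mode. Parameters are illustrative, not from [15]."""
    h, w = depth.shape
    q = np.clip((depth / depth.max() * (bins - 1)).astype(int), 0, bins - 1)
    out = np.empty_like(depth, dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            # Weight each neighbour by its colour similarity to the centre.
            wgt = np.exp(-((guide[y0:y1, x0:x1] - guide[y, x]) ** 2)
                         / (2 * sigma_g ** 2))
            hist = np.bincount(q[y0:y1, x0:x1].ravel(),
                               weights=wgt.ravel(), minlength=bins)
            # The mode of the weighted histogram becomes the output depth.
            out[y, x] = hist.argmax() * depth.max() / (bins - 1)
    return out
```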
“…Note that this so-called backward warping is far from a trivial task due to the need to determine the distance-dependent disparity between each pair of 2-D and depth pixel correspondences. Herein, we use the real-time backward warping algorithm proposed in [17], an iterative approach that solves the backward warping mapping with an accuracy of half a high-resolution 2-D pixel.…”
Section: Background and Related Work
confidence: 99%
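A one-scanline sketch of such a fixed-point search follows, assuming rectified geometry and the same hypothetical f and b as in the earlier snippet; the half-pixel accuracy and full 2-D handling of [17] are not reproduced:

```python
import numpy as np

def backward_warp_row(depth_row, focal_px, baseline_m, iters=3):
    """Illustrative fixed-point sketch of backward warping along one
    rectified scanline: for each target column x, find the source column
    x_s whose depth-induced disparity maps it onto x, i.e.
    x_s + d(Z(x_s)) = x. Not the exact algorithm of [17]."""
    n = depth_row.shape[0]
    warped = np.empty(n)
    for x in range(n):
        x_s = float(x)  # initial guess: zero disparity
        for _ in range(iters):
            # Disparity depends on the (unknown) source depth, so we
            # iterate: look up depth at the current guess, re-estimate.
            d = focal_px * baseline_m / max(depth_row[int(round(x_s))], 1e-6)
            x_s = float(np.clip(x - d, 0, n - 1))
        warped[x] = depth_row[int(round(x_s))]
    return warped
```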
“…Here, as in the following fusion-based depth enhancement techniques, the low-resolution depth map D has to be upsampled to the same spatial resolution as I. To this end, nearest-neighbour interpolation might be used [25,48], since alternative interpolation approaches such as bilinear or bicubic interpolation might introduce non-valid depth measurements along the upsampled edges of the depth map [17]. As in (1), the resulting depth map J_JBU is an enhanced version of D that has been upsampled to the 2-D guidance image resolution.…”
Section: Joint Bilateral Upsampling (JBU)
confidence: 99%
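A compact sketch of the JBU step just described, assuming a single-channel float guidance image and nearest-neighbour upsampling of D as the quoted text suggests; the σ values and window radius are illustrative:

```python
import numpy as np

def joint_bilateral_upsampling(depth_lr, guide_hr, sigma_s=2.0,
                               sigma_r=12.0, radius=4):
    """Minimal JBU sketch: nearest-neighbour upsample the low-resolution
    depth map D to the guidance resolution, then average it with weights
    combining spatial distance and guidance-image similarity."""
    H, W = guide_hr.shape
    h, w = depth_lr.shape
    # Nearest-neighbour upsampling, avoiding the non-valid depth values
    # that bilinear/bicubic interpolation would create along edges.
    ys = np.arange(H) * h // H
    xs = np.arange(W) * w // W
    D_up = depth_lr[ys[:, None], xs[None, :]]

    out = np.empty((H, W), dtype=np.float64)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            yy, xx = np.mgrid[y0:y1, x0:x1]
            # Spatial kernel times range kernel on the guidance image.
            w_s = np.exp(-((yy - y) ** 2 + (xx - x) ** 2)
                         / (2 * sigma_s ** 2))
            w_r = np.exp(-((guide_hr[y0:y1, x0:x1] - guide_hr[y, x]) ** 2)
                         / (2 * sigma_r ** 2))
            wgt = w_s * w_r
            out[y, x] = (wgt * D_up[y0:y1, x0:x1]).sum() / wgt.sum()
    return out
```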