2011
DOI: 10.1007/978-3-642-23678-5_52
Accurate and Practical Calibration of a Depth and Color Camera Pair

Cited by 102 publications (72 citation statements)
References 5 publications
“…The Kinect and the projector use different coordinate systems, and thus the calibration between them should be performed when projecting a texture image onto a curved display generated by actuators [21,22]. Let [X Y Z 1]^T be the homogeneous coordinates of a point p measured by the Kinect depth (infrared) camera, and [wx wy w]^T be the perspective coordinates of its projection point p′.…”
Section: Calibration of the Kinect with the Projector
confidence: 99%
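The quoted passage describes the standard projective mapping from a 3-D point in the depth camera frame to projector pixel coordinates. A minimal sketch of that mapping, assuming an illustrative 3x4 transform built from a made-up intrinsic matrix and an identity pose (all numeric values are hypothetical, not from the cited paper):

```python
import numpy as np

def project_point(T, p_world):
    """Map a 3-D point measured by the depth camera into projector
    pixel coordinates via a 3x4 projective transform T.

    p_world: (X, Y, Z) in the depth camera frame.
    Returns (u, v) = (wx/w, wy/w) in projector pixels.
    """
    X, Y, Z = p_world
    hom = np.array([X, Y, Z, 1.0])   # homogeneous coordinates [X Y Z 1]^T
    wx, wy, w = T @ hom              # perspective coordinates [wx wy w]^T
    return wx / w, wy / w            # dehomogenize

# Illustrative transform: identity rotation, zero translation,
# focal length 500 px, principal point (320, 240).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
Rt = np.hstack([np.eye(3), np.zeros((3, 1))])  # [R | t]
T = K @ Rt

u, v = project_point(T, (0.1, -0.05, 2.0))
```

The division by w is what makes the mapping perspective rather than affine; estimating T (or K, R, t separately) is precisely what the Kinect-projector calibration solves for.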
“…[4][5][6] The calibration of the IR and color cameras is based on Zhang's method, implemented with the Camera Calibration Toolbox for MATLAB. 7 In this method, multiple images of a chessboard pattern of known size are taken from different positions.…”
Section: The Calibration Process
confidence: 99%
“…Also, manual correspondences need to be specified to improve calibration accuracy. The second is Herrera's method [20], whose authors claim accuracy, practicality, and applicability. The method requires a planar surface to be imaged from various poses and presents a new distortion model for the depth sensor.…”
Section: State of the Art
confidence: 99%