Determining the relative pose between the camera and the structured-light plane projector is a classical problem in the calibration of line-structured light vision sensors. A geometric calibration method based on the theory of vanishing points and vanishing lines is proposed. The method uses a planar target with several parallel lines. By moving the target to at least two different positions at random, the normal vector of the structured light plane can be obtained in the camera coordinate system; because the distance between each pair of adjacent parallel lines is known exactly, the parameter D of the structured light plane can then be determined, fixing the complete equation of the light plane. Experimental results show that the accuracy of the proposed calibration method reaches 0.09 mm within a field of view of about 200 × 200 mm. Moreover, the target can be manufactured precisely with little effort, and the method is efficient and convenient owing to its simple calculation and easy operation, which makes it especially suitable for on-site calibration.
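The abstract above outlines two geometric ingredients: recovering a direction in the camera frame from a vanishing point, and combining two such directions into the light-plane normal before solving for D. The following is a minimal sketch of those steps under a pinhole model with intrinsic matrix K; the function names, the way the vanishing points are supplied, and the example numbers are ours for illustration, not the paper's actual procedure.

```python
import numpy as np

def direction_from_vanishing_point(K, v):
    """Unit 3-D direction in the camera frame from a vanishing point.

    For a pinhole camera with intrinsics K, parallel 3-D lines with direction d
    share a vanishing point v ~ K d, hence d ~ K^{-1} v.
    v is given in homogeneous pixel coordinates (3-vector).
    """
    d = np.linalg.solve(K, np.asarray(v, dtype=float))
    return d / np.linalg.norm(d)

def light_plane_from_two_directions(d1, d2, point_on_plane):
    """Plane normal from two non-parallel directions lying in the plane,
    plus the offset D from one known 3-D point on the plane.

    Returns (n, D) with the plane written as n . X + D = 0.
    """
    n = np.cross(d1, d2)
    n = n / np.linalg.norm(n)
    D = -float(n @ point_on_plane)
    return n, D

# Illustrative usage with made-up numbers (not taken from the paper):
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 480.0],
              [0.0, 0.0, 1.0]])
v1 = np.array([850.0, 470.0, 1.0])   # vanishing point at target position 1
v2 = np.array([400.0, 900.0, 1.0])   # vanishing point at target position 2
d1 = direction_from_vanishing_point(K, v1)
d2 = direction_from_vanishing_point(K, v2)
n, D = light_plane_from_two_directions(d1, d2, point_on_plane=np.array([0.0, 0.0, 500.0]))
```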
Background: Determining the relative pose between the camera and the laser projector in a line-structured light vision sensor is a classical yet important task. Typical calibration methods often face problems such as the difficulty of producing the target precisely and the introduction of perspective projection errors. Methods: In this work, a new calibration method based on a concentric-circle feature is introduced. The method rests on geometric properties and reduces the perspective projection error. First, the vanishing line of the light plane is deduced from the imaged concentric circles. Then the normal vector of the light plane is determined. Finally, the complete plane equation is obtained from the principle of intersecting planes. Results and conclusion: The proposed method is simple and robust because its underlying theory is purely geometric. An accuracy evaluation experiment shows that the calibration accuracy reaches 0.07 mm within a field of view of about 200 × 200 mm. This accuracy is comparable to that of the commonly used calibration method based on a checkerboard planar target, while our target is much simpler to produce.
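The step from the light plane's vanishing line to its normal vector relies on a standard pinhole-camera relation: a plane with unit normal n has vanishing line l ~ K^{-T} n, so n ~ K^T l. Below is a minimal sketch of that relation plus the offset computation; how the vanishing line is extracted from the imaged concentric circles is the paper's contribution and is not reproduced here, and the function names are ours.

```python
import numpy as np

def normal_from_vanishing_line(K, l):
    """Unit plane normal (camera frame) from the plane's vanishing line.

    For a pinhole camera with intrinsics K, a scene plane with unit normal n
    has vanishing line l ~ K^{-T} n, hence n ~ K^T l.
    l is the image line in homogeneous coordinates (3-vector).
    """
    n = K.T @ np.asarray(l, dtype=float)
    return n / np.linalg.norm(n)

def plane_offset_from_point(n, X):
    """Offset D of the plane n . X + D = 0, given one 3-D point X on the plane."""
    return -float(np.dot(n, X))
```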
Laser-tracking measurement systems (laser trackers) based on a vision-guiding device are widely used in industry, and their calibration is important. Because conventional methods typically suffer from disadvantages such as difficult machining of the target and over-dependence on the retroreflector, a novel calibration method is presented in this paper. The retroreflector, which is required by the usual calibration method, is not needed in our approach. Since the laser beam is a straight line, points on the beam can be obtained with the help of an ordinary planar target. In this way, the equation of a laser beam can be determined in the camera coordinate system, while its counterpart in the laser-tracker coordinate system is obtained from the encoders of the laser tracker. Once several such pairs of beam equations are available, the rotation matrix can be solved from the direction vectors of the laser beams in the two coordinate systems. Because the intersection of the laser beams is the origin of the laser-tracker coordinate system, the translation vector can also be determined. The proposed method not only calibrates a single laser-tracking measurement system but also provides a reference for the calibration of a multi-station system. Simulations evaluating the effects of several critical factors demonstrate the robustness and accuracy of the method. In real experiments, the root-mean-square error of the calibration result reached 1.46 mm within a range of 10 m, even though the vision-guiding device is focused on a point approximately 5 m from the origin of its coordinate system, with a field of view of approximately 200 mm × 200 mm.
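The two estimation steps named in the abstract, rotation from paired beam directions and translation from the beams' common intersection, can both be posed as small least-squares problems. The sketch below shows one standard way to solve them (Kabsch-style SVD for the rotation, a projector-based normal-equation system for the intersection); it is our illustration under the assumption that the tracker-to-camera transform is X_cam = R X_trk + t, not the paper's implementation.

```python
import numpy as np

def rotation_from_directions(dirs_tracker, dirs_camera):
    """Least-squares rotation R with d_cam,i ≈ R d_trk,i (Kabsch / SVD).

    dirs_* are (N, 3) arrays of unit direction vectors of the laser beams
    expressed in the laser-tracker and camera frames, respectively.
    """
    A = np.asarray(dirs_tracker, dtype=float)
    B = np.asarray(dirs_camera, dtype=float)
    H = A.T @ B                                   # 3x3 cross-covariance of the directions
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ S @ U.T                         # maps tracker directions into the camera frame

def beam_intersection(points, dirs):
    """Least-squares intersection of the lines X = p_i + s * d_i (camera frame).

    Since every beam passes through the tracker origin, this point equals the
    translation t of the tracker-to-camera transform X_cam = R X_trk + t.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)            # projector onto the beam's normal space
        A += P
        b += P @ np.asarray(p, dtype=float)
    return np.linalg.solve(A, b)
```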
In this paper, a new method to calibrate a trinocular vision sensor is presented. A planar target with several parallel lines is used. The trifocal tensor of the three image planes is computed from line correspondences, from which a compatible essential matrix between each pair of cameras is obtained. The rotation and translation between the cameras are then deduced from the singular value decomposition of the corresponding essential matrix. In the proposed calibration method, image rectification is carried out to remove perspective distortion. Because the features used are straight lines, precise point-to-point correspondence is not required. Experimental results show that the proposed calibration method yields accurate results. Moreover, the trifocal tensor also provides a strict constraint for feature matching, as described in our previous work. The root-mean-square error of the measured distances is 0.029 mm within a field of view of about 250 × 250 mm. Since parallel features are common in natural scenes, the method also offers a new approach to the self-calibration of a trinocular vision sensor.
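The final step mentioned above, recovering rotation and translation from an essential matrix via SVD, follows the textbook decomposition E ~ [t]_x R. A minimal sketch is given below; selecting the physically valid candidate requires a cheirality (points-in-front-of-both-cameras) check that is not shown, and the function name is ours.

```python
import numpy as np

def decompose_essential(E):
    """Four (R, t) candidates from an essential matrix via SVD.

    E ~ [t]_x R; the valid pair is the one that places the triangulated
    points in front of both cameras (cheirality check, not shown here).
    The translation is recovered only up to scale.
    """
    U, _, Vt = np.linalg.svd(E)
    # Enforce proper rotations (det = +1).
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
    R1 = U @ W @ Vt
    R2 = U @ W.T @ Vt
    t = U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```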