In the medical field, guidance for following the surgical plan is crucial. Image overlay projection links the surgical plan with the patient: it realizes augmented reality (AR) by projecting a computer-generated image onto the surface of the target through a projector, visualizing additional information in the scene. By overlaying anatomical information or surgical plans on the surgical area, projection enhances the surgeon's understanding of the anatomical structure, intuitively visualizes the surgical target and the key structures of the operation, and avoids diverting the surgeon's sight between the monitor and the patient. However, projecting the surgical navigation information onto the target precisely and efficiently remains a challenge. In this study, we propose a projector-based surgical navigation system. Through a gray code-based calibration method, the projector is calibrated with a camera and then integrated with an optical spatial locator, so that the navigation information of the operation can be accurately projected onto the target area. We validated the projection accuracy of the system through back projection, with an average projection error of 3.37 pixels in the x direction and 1.51 pixels in the y direction, and through model projection, with an average position error of 1.03 ± 0.43 mm. We also carried out puncture experiments using the system, achieving a correct rate of 99%, and qualitatively analyzed the system's performance through a questionnaire. The results demonstrate the efficacy of our proposed AR system.
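The gray code-based calibration mentioned above relies on projecting binary stripe patterns so that every camera pixel can be matched to the projector pixel that illuminated it; these dense correspondences then let the projector be calibrated like a second camera. The following is a minimal sketch of that idea under our own assumptions (the helper names `gray_code_patterns` and `decode_columns` are illustrative, not from the paper), showing only the column-encoding and decoding step:

```python
import numpy as np

def gray_code_patterns(width, n_bits):
    """Binary gray-code stripe patterns encoding each projector column."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                       # binary-reflected gray code
    bits = (gray[None, :] >> np.arange(n_bits)[:, None]) & 1
    return bits.astype(np.uint8)                    # shape: (n_bits, width)

def decode_columns(captured_bits):
    """Recover the projector column seen by each camera pixel.

    captured_bits: (n_bits, H, W) array of 0/1 values obtained by
    thresholding the camera images of each projected pattern.
    """
    n_bits = captured_bits.shape[0]
    weights = (1 << np.arange(n_bits)).reshape(n_bits, 1, 1)
    gray = np.sum(captured_bits.astype(np.int64) * weights, axis=0)
    binary = gray.copy()                            # convert gray code back to column index
    shift = gray >> 1
    while np.any(shift):
        binary ^= shift
        shift >>= 1
    return binary                                   # projector column per camera pixel
```

A second set of patterns with horizontal stripes would recover projector rows in the same way; the resulting camera-to-projector correspondences on a calibration target can then be fed to standard camera-calibration routines to estimate the projector intrinsics and the projector-camera pose.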
Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety compared with manual implantation. In developing such a system, hand–eye calibration is an essential component that determines the transformation between the position tracking system and the robot arm. In this paper, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX=XB equation. Our hand–eye calibration method consists of tool-tip pivot calibrations in two coordinate systems, followed by paired-point matching, where the point pairs are generated via the steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand–eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand–eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
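The paired-point matching step described above can be realized with a standard least-squares rigid registration, for example Arun's SVD-based method, applied to the tool-tip positions recorded in the tracker frame and the corresponding positions in the robot-base frame during the steady robot motion. A minimal sketch, with hypothetical variable names and not necessarily the authors' exact implementation, follows:

```python
import numpy as np

def rigid_register(P, Q):
    """Least-squares rigid transform (R, t) mapping point set P onto Q.

    P, Q: (N, 3) arrays of paired points expressed in the two coordinate
    systems (e.g., tracker space and robot-base space). Returns R (3x3)
    and t (3,) such that Q ≈ (R @ P.T).T + t.
    """
    cp, cq = P.mean(axis=0), Q.mean(axis=0)         # centroids
    H = (P - cp).T @ (Q - cq)                       # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

The reflection guard ensures the recovered matrix is a proper rotation even when the point pairs are noisy or nearly coplanar.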
In robot-assisted ultrasound-guided needle biopsy, it is essential to calibrate the ultrasound probe and to perform hand–eye calibration of the robot in order to link intra-operatively acquired ultrasound images with robot-assisted needle insertion. Based on a high-precision optical tracking system, novel methods for ultrasound probe and robot hand–eye calibration are proposed. Specifically, we first fix optically trackable markers to the ultrasound probe and to the robot, respectively. We then design a five-wire phantom to calibrate the ultrasound probe. Finally, an effective method that takes advantage of the steady movement of the robot, without an additional calibration frame or the need to solve the AX=XB equation, is proposed for hand–eye calibration. After calibration, our system allows for in situ definition of target lesions and aiming trajectories from intra-operatively acquired ultrasound images in order to align the robot for precise needle biopsy. Comprehensive experiments were conducted to evaluate the accuracy of the individual components of our system as well as the overall system accuracy. The experimental results demonstrated the efficacy of the proposed methods.
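Once the probe calibration and the hand–eye calibration are available, a lesion picked in an ultrasound image can be chained through the calibrated transforms into the robot-base frame to aim the needle. The sketch below illustrates that chain under our own assumptions (all names are hypothetical, and each transform is a 4x4 homogeneous matrix):

```python
import numpy as np

def pixel_to_robot(u, v, sx, sy, T_probe_image, T_tracker_probe, T_robot_tracker):
    """Map pixel (u, v) in the ultrasound image to robot-base coordinates.

    sx, sy:           pixel scales in mm/pixel (from probe calibration)
    T_probe_image:    image-to-probe-marker transform (probe calibration)
    T_tracker_probe:  probe-marker pose reported by the optical tracker
    T_robot_tracker:  tracker-to-robot-base transform (hand-eye calibration)
    """
    p_image = np.array([sx * u, sy * v, 0.0, 1.0])  # point on the image plane, homogeneous
    p_robot = T_robot_tracker @ T_tracker_probe @ T_probe_image @ p_image
    return p_robot[:3]
```

Errors in any single transform accumulate along this chain, which is why the accuracy of each calibration component is evaluated separately in addition to the overall system accuracy.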