Robot-assisted surgery is becoming increasingly common in the operating room (OR) for procedures such as orthopedic surgery. However, the robotic execution of surgical steps cannot rely solely on preoperative plans. Using pedicle screw placement as an example, additional adjustments are needed to adapt to intraoperative changes when the preoperative plan becomes outdated. Adjusting a surgical plan during surgery is non-trivial and typically complex, since the interfaces available in current robotic systems are not always intuitive to use. Recently, thanks to technical advances in head-mounted displays (HMDs), augmented reality (AR)-based medical applications have been emerging in the OR. Rendered virtual objects can be overlaid on real-world physical objects to provide intuitive displays of the surgical site and anatomy. Moreover, combining AR with robotics is even more promising, but this potential has not been fully exploited. In this paper, an AR-based robotic approach is proposed and its technical feasibility is demonstrated in simulated pedicle screw placement. A method for spatial calibration between the robot and the HoloLens 2 without an external 3D tracking system is presented. The developed system offers an intuitive AR–robot interaction between the surgeon and the surgical robot: the current surgical plan is projected to the surgeon for fine-tuning, and the updated plan is immediately transferred back to the robot for execution. A series of bench-top experiments was conducted to evaluate system accuracy and human-related errors. A mean calibration error of 3.61 mm was found. The overall target pose error was 3.05 mm in translation and 1.12° in orientation. The average time for defining a target entry point intraoperatively was 26.56 s.
This work offers an intuitive AR-based robotic approach that could facilitate the adoption of robotic technology in the OR and boost the synergy between AR and robotics in other medical applications.
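The abstract does not detail the calibration algorithm, but spatial calibration between a robot frame and an HMD frame from paired 3D landmarks is commonly solved as a rigid point-set registration problem. The sketch below, which is an illustration and not the authors' actual method, uses the Kabsch/SVD solution and reports a mean fiducial registration error comparable in spirit to the reported 3.61 mm calibration error; the function names are hypothetical.

```python
import numpy as np

def register_frames(p_robot, p_hmd):
    """Estimate the rigid transform (R, t) mapping HMD-frame points to the
    robot frame via the Kabsch/SVD method, given paired 3D landmarks (N x 3)."""
    p_robot = np.asarray(p_robot, dtype=float)
    p_hmd = np.asarray(p_hmd, dtype=float)
    c_r, c_h = p_robot.mean(axis=0), p_hmd.mean(axis=0)
    # 3x3 cross-covariance of the centered point sets
    H = (p_hmd - c_h).T @ (p_robot - c_r)
    U, _, Vt = np.linalg.svd(H)
    # Correction term guarantees a proper rotation (det(R) = +1, no reflection)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_r - R @ c_h
    return R, t

def mean_registration_error(R, t, p_robot, p_hmd):
    """Mean point-to-point error (same units as input, e.g. mm) after (R, t)."""
    mapped = (R @ np.asarray(p_hmd, dtype=float).T).T + t
    return float(np.mean(np.linalg.norm(mapped - np.asarray(p_robot), axis=1)))
```

In practice, the residual error would be dominated by landmark-localization noise on the HMD side rather than by the solver itself.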
During laparoscopic sacrocolpopexy, pelvic organ prolapse is repaired by suturing one side of a synthetic mesh around the vaginal vault while stapling the other end to the sacrum, restoring the anatomical position of the vagina. A perineal assistant positions and tensions the vault with a vaginal manipulator instrument to properly expose the vaginal tissue to the laparoscopic surgeon. A technical difficulty during this surgery is the loss of depth perception caused by visualizing the patient's internal anatomy on a 2D screen. Especially during precise surgical tasks, a more natural way to perceive the distance between the laparoscopic instruments and the surgical region of interest could be advantageous. This work describes an exploratory study investigating the potential of introducing 3D visualization into this surgical intervention; in particular, experiments were conducted with autostereoscopic display technology. A mixed reality setup was constructed featuring a virtual reality model of the vagina, 2D and 3D visualization, a physical interface representing the tissue of the body wall, and a tracking system to record instrument motion. In an experiment, participants had to navigate the instrument to a number of pre-defined locations under 2D or 3D visualization. Compared to 2D, a considerable reduction in average task time (−42.9 %), travelled path length (−31.8 %) and errors (−52.2 %) was observed when performing the task in 3D. While this work demonstrated a potential benefit of autostereoscopic visualization over 2D visualization, in future work we wish to investigate whether a benefit also exists when comparing this technology with conventional stereoscopic visualization, and whether stereoscopy can be used for (semi-)automated guidance during robotic laparoscopy.
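The reported metrics (task time, travelled path length, error count) are standard in instrument-tracking studies. As a minimal sketch, assuming the tracking system yields a sequence of 3D instrument-tip positions, path length is the sum of distances between consecutive samples, and the percent reductions quoted above are simple relative changes between the 2D and 3D conditions; the function names here are illustrative, not from the study.

```python
import numpy as np

def path_length(positions):
    """Total travelled path length of a tracked instrument tip, given an
    N x 3 array of consecutive 3D positions (e.g. in mm)."""
    p = np.asarray(positions, dtype=float)
    # Sum of Euclidean distances between consecutive samples
    return float(np.sum(np.linalg.norm(np.diff(p, axis=0), axis=1)))

def relative_change(metric_2d, metric_3d):
    """Percent change of a metric in the 3D condition relative to 2D
    (negative values indicate a reduction under 3D visualization)."""
    return 100.0 * (metric_3d - metric_2d) / metric_2d
```

For example, a path visiting (0,0,0), (3,4,0) and (3,4,12) has length 5 + 12 = 17 mm, and a task time falling from 100 s to 57.1 s corresponds to a −42.9 % change, matching the form of the reported results.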