Introduction: Medical training is a long and demanding process, whose first stages are usually based on two-dimensional, static, and unrealistic content. In contrast, advances in preoperative imaging have made it an essential part of any successful surgical procedure. However, access to this information often requires the support of an assistant and may compromise sterility during the surgical process. Herein, we present two mixed reality-based solutions that aim to improve both training and planning in minimally invasive surgery. Materials and Methods: Applications were developed for the Microsoft HoloLens device. The urology training application provided access to a variety of anatomical and surgical training contents. Expert urological surgeons completed a questionnaire to evaluate its use. The surgical planning solution was used during laparoscopic renal tumorectomy in an experimental model and video-assisted right upper lobectomy in an adult patient. Surgeons reported their experience using this preoperative planning tool for surgery. Results: The solution developed for medical training was considered a useful tool for training in urological anatomy, facilitating the translation of this knowledge to clinical practice. The solution developed for surgical planning allowed surgeons to access the patient's clinical information in real time, such as preoperative imaging studies, three-dimensional surgical planning models, and medical history, facilitating the surgical approach. The surgeon's view through the mixed reality device was shared with the rest of the surgical team. Conclusions: The mixed reality-based solution for medical training facilitates the transfer of knowledge into clinical practice. The preoperative planning tool provides real-time access to essential patient information without compromising the sterility of the surgical field. However, further studies are needed to comprehensively validate its clinical application.
INTRODUCTION The adoption of surgical planning systems could greatly simplify the performance of video-assisted thoracoscopic surgery and improve its safety. MATERIAL AND METHODS A new application for surgical planning in video-assisted thoracoscopic surgery was developed using the HoloLens (v1) mixed reality device. The information was displayed as interactive holograms controlled by gestures, gaze, or voice. The application was validated during a video-assisted right upper lobectomy, including systematic lymphadenectomy, for squamous cell carcinoma in the right upper lobe. RESULTS No complications occurred during surgery. Prior to surgery, the system allowed the surgeon to access the patient's medical history in real time, review the computed tomography study, and visualize and manipulate a 3D model of the lung with its respective vascular and bronchial elements, as well as the tumor to be removed. The surgeon's view through the device was easily shared with the rest of the surgical team. The surgeon placed the holographic models with the surgical planning information outside the field of view of the operating table, for possible reference during the procedure. The weight of the device and the heat it generated were identified as ergonomic aspects to be improved. CONCLUSIONS The system provides real-time access to important patient information for surgical planning during video-assisted lobectomy, without compromising the sterility of the surgical act. The surgeon's view can be shared for communication and learning purposes, as well as recorded for later review of surgical complications.
Wearable technology is an emerging field that has the potential to revolutionize healthcare. Advances in sensors, augmented reality devices, the internet of things, and artificial intelligence offer clinically relevant and promising functionalities in the field of surgery. Apart from its well-known benefits for the patient, minimally invasive surgery (MIS) is a technically demanding surgical discipline for the surgeon. In this regard, wearable technology has been applied in MIS to assess the surgeon's ergonomic conditions, the interaction with the patient, and the quality of surgical performance, as well as to provide tools for surgical planning and assistance during surgery. The aim of this chapter is to provide an overview, based on the scientific literature and our experience, of the use of wearable technology in MIS in both experimental and clinical settings.
Introduction Microsurgery involves procedures that demand maximum precision and dexterity, as well as the absence of tremor. In this regard, new robotic-assisted microsurgery solutions have emerged in recent years. The objective of this review is to analyze the current status of robotic platforms in microsurgery. Methods This systematic review focused on the PubMed database over the last seven years (2015–2022), using keywords related to robotic platforms for microsurgery. Articles reporting clinical or preclinical studies were considered; non-English language articles, abstracts, conference proceedings, reviews, and letters were excluded. Results A total of 324 articles were identified, 27 of which met the eligibility criteria. Of the 14 platforms analyzed, da Vinci, MM-3, and the easyMicro prototype incorporate their own vision system. The majority of the systems provide tremor filtering and haptic feedback. The da Vinci system is the most widely used in vascular and nerve microsurgery, reconstructive surgery and lymph node dissection, eye surgery, and urology. The MUSA and Symani systems have been used in reconstructive supermicrosurgery, neuroArm in neurosurgery, and the Surgical Robotic System and the easyMicro prototype in reconstructive and ocular surgery. The MM-3 (vascular), SMART (ocular), and smartARM (neurosurgery) have been tested in artificial models. Conclusions Among the platforms analyzed, da Vinci was the most widely used, specifically in ocular and reconstructive surgery. MUSA and neuroArm are rapidly progressing in reconstructive microsurgery and neurosurgery, respectively.
Introduction The objective of this study is to perform an initial evaluation of a system for assessment and visual feedback of the surgeon's posture during laparoscopic practice. Methods To monitor the surgeon's posture, we used the Xsens motion tracking system. The neck, shoulder, elbow, and wrist joints were analyzed during laparoscopic practice. A visual feedback method was designed using Unity3D that indicates in real time whether any of the analyzed joints adopts an inadequate posture, allowing for its correction. The system was evaluated with a group of seven novice laparoscopic surgeons who performed several repetitions of two basic training tasks (eye-hand coordination and transfer). The participants were randomly assigned to group A (4 subjects), who received visual feedback, and group B (3 subjects), the control group. Results In the coordination task, group A spent less time in ergonomically inappropriate postures for the left shoulder and elbow, with statistical significance for the left shoulder. This group also showed a positive evolution in the different joints during the training process. In the transfer task, both groups showed a positive evolution. Conclusions Preliminary results show that the presented method allows novice surgeons to improve their posture during laparoscopic practice. The use of the non-dominant hand and the complexity of the task seem to be conditioning factors in the surgeon's ergonomics.