Craniosynostosis must often be corrected surgically by remodeling the affected bone tissue. Surgical reconstruction currently relies mostly on the subjective judgement of the surgeon to best restore normal skull shape, since the remodeled bone is placed and fixed manually, and slight variations can compromise the cosmetic outcome. The objective of this study was to describe and evaluate a novel workflow for patient-specific correction of craniosynostosis based on intraoperative navigation and 3D printing. The workflow was followed in five patients with craniosynostosis. Virtual surgical planning was performed, and patient-specific cutting guides and templates were designed and manufactured. These guides and templates were used to control the osteotomies and bone remodeling. An intraoperative navigation system based on optical tracking made it possible to follow the preoperative virtual plan in the operating room through real-time positioning and 3D visualization. Navigation accuracy was estimated using intraoperative surface scanning as the gold standard. Average errors of 0.62 mm and 0.64 mm were obtained in the remodeled frontal region and the supraorbital bar, respectively. Intraoperative navigation is an accurate and reproducible technique for the correction of craniosynostosis that enables optimal translation of the preoperative plan to the operating room.
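As a rough illustration of the accuracy assessment described above, the sketch below computes a mean point-to-surface distance between positions recorded with a tracked pointer and the vertices of an intraoperative surface scan. This is a minimal Python example under assumed inputs (synthetic point sets in millimetres), not the authors' software; the function name and data are hypothetical.

```python
# Minimal sketch (not the authors' implementation): navigation error estimated as the
# mean distance from pointer-recorded positions to the scanned surface, both in mm.
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_error(navigated_points, scan_vertices):
    """Mean nearest-neighbour distance (mm) from navigated points to the scan vertices."""
    tree = cKDTree(scan_vertices)            # index the scan vertices for fast lookup
    distances, _ = tree.query(navigated_points)
    return distances.mean()

# Synthetic example: points sampled near a flat "surface" standing in for the scan.
rng = np.random.default_rng(0)
scan = rng.uniform(-50, 50, size=(5000, 3)); scan[:, 2] = 0.0   # surface at z = 0
probe = rng.uniform(-40, 40, size=(20, 3)); probe[:, 2] = rng.normal(0, 0.6, 20)
print(f"Estimated navigation error: {mean_surface_error(probe, scan):.2f} mm")
```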
Augmented reality (AR) is a promising alternative to conventional surgical navigation in clinical scenarios. However, the registration between augmented data and the real-world space remains a limiting factor. In this study, the authors propose a method based on desktop three-dimensional (3D) printing to create patient-specific tools containing a visual pattern that enables automatic registration. Such a tool fits the patient only in the location it was designed for, avoiding placement errors. The solution was developed as a software application running on Microsoft HoloLens. The workflow was validated on a 3D-printed phantom replicating the anatomy of a patient presenting with an extraosseous Ewing's sarcoma, and then tested during the actual surgical intervention. The application allowed physicians to visualise the skin, bone and tumour location overlaid on the phantom and on the patient. This workflow could be extended to many clinical applications in the surgical field, as well as to training and simulation, in cases where rigid anatomical structures are involved. Although the authors tested their workflow on an AR head-mounted display, a similar approach could be applied to other devices such as tablets or smartphones.
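To make the registration idea concrete, the following sketch shows one way such a patient-specific visual pattern could drive automatic registration, assuming an OpenCV (>= 4.7) ArUco pipeline rather than the authors' HoloLens implementation: the pattern is detected in a camera frame, its pose is solved with PnP, and the transform from planning (anatomy) coordinates to the camera is obtained by composing it with the marker-to-anatomy transform fixed at design time, since the tool fits the patient in only one position. The marker size, dictionary, and function names are illustrative assumptions.

```python
# Sketch of marker-based automatic registration (illustrative, not the authors' code).
import cv2
import numpy as np

MARKER_SIZE_MM = 30.0  # physical side length of the printed pattern (assumed)
# Marker corners in the marker's own frame (z = 0 plane), in mm, in ArUco corner order.
_OBJ_POINTS = np.array([
    [-MARKER_SIZE_MM / 2,  MARKER_SIZE_MM / 2, 0],
    [ MARKER_SIZE_MM / 2,  MARKER_SIZE_MM / 2, 0],
    [ MARKER_SIZE_MM / 2, -MARKER_SIZE_MM / 2, 0],
    [-MARKER_SIZE_MM / 2, -MARKER_SIZE_MM / 2, 0],
], dtype=np.float32)

def register_anatomy(frame_bgr, camera_matrix, dist_coeffs, anatomy_to_marker):
    """Return the 4x4 transform from planning (anatomy) coordinates to the camera frame,
    or None if the printed pattern is not visible in the frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
        cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    image_points = corners[0].reshape(-1, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(_OBJ_POINTS, image_points, camera_matrix, dist_coeffs)
    if not ok:
        return None
    marker_to_cam = np.eye(4)
    marker_to_cam[:3, :3], _ = cv2.Rodrigues(rvec)   # marker pose in the camera frame
    marker_to_cam[:3, 3] = tvec.ravel()
    # anatomy_to_marker is known from the virtual plan, because the 3D-printed tool
    # was designed to fit a unique position on the patient.
    return marker_to_cam @ anatomy_to_marker
```

With this transform, virtual models defined in planning coordinates (skin, bone, tumour) can be rendered in the camera view without any manual alignment step.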
During the last decade, orthopedic oncology has benefited from computerized medical imaging, which reduces human dependency and improves accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt well to this kind of intervention. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D-printed models and a novel AR-based smartphone application (app). The app can display the patient’s anatomy and the tumor’s location. A 3D-printed reference marker, designed to fit in a unique position on the affected bone tissue, enables automatic registration. The system was evaluated in terms of visualization accuracy and usability throughout the surgical workflow. Experiments on six realistic phantoms yielded a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and the surgical intervention. These results and the positive feedback from surgeons and patients suggest that combining AR and 3D printing can improve efficacy, accuracy, and the patient experience.
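A visualization-accuracy evaluation like the one reported (errors below 3 mm on phantoms) could be scored as the Euclidean distance between landmark positions indicated through the AR overlay and their reference positions on the phantom. The sketch below shows this computation on hypothetical measurements; it is not the authors' evaluation code, and the landmark coordinates are invented for illustration.

```python
# Illustrative scoring of visualization accuracy as per-landmark Euclidean error (mm).
import numpy as np

def visualization_errors(overlay_points_mm, reference_points_mm):
    """Per-landmark Euclidean error in mm between overlaid and reference positions."""
    diffs = np.asarray(overlay_points_mm, float) - np.asarray(reference_points_mm, float)
    return np.linalg.norm(diffs, axis=1)

# Hypothetical measurements for a single phantom (mm).
reference = np.array([[10.0, 20.0, 5.0], [42.5, 18.0, 7.5], [25.0, 55.0, 12.0]])
overlaid = reference + np.array([[0.8, -1.2, 0.5], [1.5, 0.9, -0.7], [-0.6, 1.1, 1.4]])
errors = visualization_errors(overlaid, reference)
print(f"mean = {errors.mean():.2f} mm, max = {errors.max():.2f} mm, "
      f"all below 3 mm: {bool((errors < 3.0).all())}")
```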