Data from clinical science and medical imaging are traditionally displayed in two dimensions (2D) on a computer monitor. In contrast, three-dimensional (3D) virtual reality (VR) expands the realm of 2D image visualization, enabling an immersive VR experience with unhindered spatial interaction by the user. Thus far, analysis of data extracted from VR applications has been mainly qualitative. In this study, we extend VR and provide evidence for quantitative VR research by validating the digital VR display of computed tomography (CT) data of the orbit. Volumetric CT data were transferred into a VR environment and rendered there. Subsequently, seven graders performed repeated, blinded diameter measurements. The intergrader variability of the measurements in VR was markedly lower than that of measurements in the physical world, and the VR measurements were reasonably consistent with their corresponding structures in the real context; overall, the VR measurements were 5.49% higher. As such, this study attests to the ability of VR to provide comparable quantitative data alongside the added benefits of VR interfaces. VR holds considerable potential for future research in ophthalmology and, beyond that, in any scientific field that uses three-dimensional data.

Orbital tumors can be either primary, arising from structures within the orbit, or secondary, due to metastatic spread of a primary tumor elsewhere 1-3. A myriad of clinical findings suggest the presence of an orbital mass, namely proptosis, diplopia, pain, conjunctival congestion and varying degrees of visual loss 4,5. Diagnosis relies on adequate clinical examination and orbital imaging such as magnetic resonance imaging (MRI) and/or computed tomography (CT) 6. Despite the great progress in medical imaging, orbital lesions may lack characteristic neuroimaging features, making an imaging-based diagnosis difficult. Often, the diagnosis of specific orbital disorders such as Graves' disease-associated orbitopathy, orbital inflammation, orbital lymphoma, lacrimal gland epithelial tumors, metastatic carcinoma, and vascular orbital lesions is ambiguous 7-10. Confirmation of the diagnosis frequently relies on a biopsy, which can be technically challenging and prone to complications owing to the dense arrangement of tissue or the inaccessibility of many of these lesions 11. Visualization of orbital lesions is still based on pre-surgical assessment of two-dimensional (2D) images on a computer monitor or on intraoperative exploration 12,13. The possibilities for presentation of, and interaction with, such data in virtual reality are increasing 14. In this context, VR has recently been optimized as an enhanced medical image display method and has been shown to safeguard visual comfort 15,16. The main objective of this study was to extend current medical image display and, for the first time, to validate the level of spatial precision of orbitometry on CT data, representing the physical world, against the precision achieved in 3D virtual reality. This is especially important when new ways are explored to use ...
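The quantitative comparison described above can be made concrete with a small sketch. The snippet below is a minimal, illustrative example only: the measurement values, the seven-graders-by-three-repeats layout, and the choice of the coefficient of variation as the intergrader variability metric are assumptions, not the study's actual data or analysis. It computes a variability figure for VR and physical-world measurements and the overall percent difference between the two.

```python
import numpy as np

# Hypothetical repeated diameter measurements (mm): one row per grader,
# one column per repeated measurement of the same orbital landmark.
vr_measurements = np.array([
    [24.1, 24.3, 24.2], [24.0, 24.2, 24.1], [24.2, 24.4, 24.3],
    [24.1, 24.1, 24.2], [24.3, 24.2, 24.4], [24.0, 24.1, 24.0],
    [24.2, 24.3, 24.2],
])
physical_measurements = np.array([
    [22.8, 23.4, 23.1], [23.5, 22.6, 23.0], [22.4, 23.8, 23.2],
    [23.1, 22.5, 23.6], [23.9, 22.7, 23.3], [22.6, 23.5, 22.9],
    [23.2, 23.7, 22.8],
])

def intergrader_cv(measurements: np.ndarray) -> float:
    """Coefficient of variation (%) of the per-grader mean measurements."""
    grader_means = measurements.mean(axis=1)
    return 100.0 * grader_means.std(ddof=1) / grader_means.mean()

# Overall percent difference of VR relative to physical-world measurements.
percent_diff = 100.0 * (
    vr_measurements.mean() - physical_measurements.mean()
) / physical_measurements.mean()

print(f"Intergrader CV (VR):       {intergrader_cv(vr_measurements):.2f} %")
print(f"Intergrader CV (physical): {intergrader_cv(physical_measurements):.2f} %")
print(f"VR vs physical difference: {percent_diff:+.2f} %")
```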
Purpose We present a feasibility study for the visuo-haptic simulation of pedicle screw tract palpation in virtual reality, using an approach that requires no manual processing or segmentation of the volumetric medical data set. Methods In a first experiment, we quantified the forces and torques present during the palpation of a pedicle screw tract in a real boar vertebra. We equipped a ball-tipped pedicle probe with a 6-axis force/torque sensor and a motion capture marker cluster. We simultaneously recorded the pose of the probe relative to the vertebra and measured the generated forces and torques during palpation. This allowed us to replay the recorded palpation movements in our simulator and to fine-tune the haptic rendering so that it approximates the measured forces and torques. In a second experiment, we asked two neurosurgeons to palpate a virtual version of the same vertebra in our simulator while we logged the forces and torques sent to the haptic device. Results In the experiment with the real vertebra, the maximum measured force along the longitudinal axis of the probe was 7.78 N and the maximum measured bending torque was 0.13 Nm. In an offline simulation of the motion of the pedicle probe recorded during the palpation of a real pedicle screw tract, our approach generated forces and torques that were similar in magnitude and progression to the measured ones. When surgeons tested our simulator, the distributions of the computed forces and torques were similar to the measured ones; however, higher forces and torques occurred more frequently than in the real measurements. Conclusions We demonstrated the suitability of direct visual and haptic volume rendering for simulating a specific surgical procedure. Our approach of fine-tuning the simulation by measuring the forces and torques that arise while palpating a real vertebra produced promising results.
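To illustrate the kind of quantities reported above (axial force and bending torque of the probe), the following sketch decomposes a single 6-axis force/torque reading into the force component along the probe's longitudinal axis and the residual bending torque. This is an assumed, simplified formulation: the function name, the sensor-frame convention (probe axis aligned with the sensor z-axis), and the numerical reading are illustrative and not taken from the study.

```python
import numpy as np

def decompose_wrench(force: np.ndarray, torque: np.ndarray, probe_axis: np.ndarray):
    """Split a 6-axis force/torque reading (sensor frame) into the force
    component along the probe's longitudinal axis and the remaining
    bending torque magnitude (assumed frame conventions)."""
    axis = probe_axis / np.linalg.norm(probe_axis)
    axial_force = float(force @ axis)                        # push/pull along the probe
    torsion = (torque @ axis) * axis                         # twist about the probe axis
    bending_torque = float(np.linalg.norm(torque - torsion)) # lateral bending component
    return axial_force, bending_torque

# Illustrative reading: mostly axial push with a small lateral component.
f = np.array([0.4, 0.9, 7.6])      # N, sensor frame
t = np.array([0.10, -0.06, 0.01])  # Nm, sensor frame
z = np.array([0.0, 0.0, 1.0])      # assumption: probe axis == sensor z-axis

fa, tb = decompose_wrench(f, t, z)
print(f"axial force:    {fa:.2f} N")
print(f"bending torque: {tb:.2f} Nm")
```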
Purpose Understanding the properties and capabilities of a robotic system is essential to a successful medical intervention, as each system is characterized by different capabilities and limits. Robot positioning is a crucial step in the surgical setup that ensures proper reachability of the desired port locations and facilitates the docking procedure. This demanding task requires considerable experience to master, especially with multiple trocars, which raises the barrier to entry for surgeons in training. Methods Previously, we demonstrated an Augmented Reality-based system that visualizes the rotational workspace of the robotic system and showed that it helps the surgical staff optimize patient positioning for single-port interventions. In this work, we implemented a new algorithm that allows automatic, real-time robotic arm positioning for multiple ports. Results Our system, based on the rotational workspace data of the robotic arm and the set of trocar locations, can calculate the optimal position of the robotic arm in milliseconds for the positional workspace and in seconds for the rotational workspace, in both virtual and augmented reality setups. Conclusions Building on the previous work, we extended our system to support multiple ports, covering a broader range of surgical procedures, and introduced an automatic positioning component. Our solution can decrease the surgical setup time and eliminate the need to reposition the robot mid-procedure, and it is suitable both for the preoperative planning step in VR and for use in the operating room, running on an AR headset.
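As a rough illustration of the positioning idea, the sketch below scores candidate robot base positions against a set of trocar locations using a simplified annular reachability model and selects the best candidate by exhaustive search. The workspace model, reachability limits, function names, and sample coordinates are all assumptions made for illustration; the actual system relies on the measured rotational workspace data of the robotic arm rather than this simplified model.

```python
import numpy as np

def score_base_position(base: np.ndarray, trocars: np.ndarray,
                        reach_min: float, reach_max: float) -> float:
    """Score a candidate base position by how comfortably every trocar sits
    inside an (assumed) annular reachable workspace around the base.
    Returns -inf if any trocar is unreachable, otherwise the worst-case
    margin to the workspace boundary (larger is better)."""
    dists = np.linalg.norm(trocars - base, axis=1)
    if np.any(dists < reach_min) or np.any(dists > reach_max):
        return -np.inf
    margins = np.minimum(dists - reach_min, reach_max - dists)
    return float(margins.min())

def best_base_position(candidates: np.ndarray, trocars: np.ndarray,
                       reach_min: float = 0.3, reach_max: float = 0.9) -> np.ndarray:
    """Evaluate candidate base positions (e.g. a grid sampled around the
    operating table) and return the best-scoring one."""
    scores = [score_base_position(c, trocars, reach_min, reach_max) for c in candidates]
    return candidates[int(np.argmax(scores))]

# Illustrative example: three trocar locations (m) and a coarse candidate grid.
trocars = np.array([[0.10, 0.20, 0.95], [0.00, 0.35, 0.95], [-0.10, 0.25, 0.95]])
xs, ys = np.meshgrid(np.linspace(-0.8, 0.8, 33), np.linspace(-0.8, 0.8, 33))
candidates = np.stack([xs.ravel(), ys.ravel(), np.full(xs.size, 0.8)], axis=1)

print("best base position:", best_base_position(candidates, trocars))
```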