Successful surgical and interventional treatment of congenital heart disease requires an understanding of the complex, patient-specific, three-dimensional dynamic anatomy of the heart, derived from imaging data such as three-dimensional echocardiography. Conventional clinical systems use flat screens, so the display remains two-dimensional, which limits full understanding of the three-dimensional dynamic data. Moreover, controlling three-dimensional visualisation with two-dimensional tools is often difficult, so these systems tend to be used only by imaging specialists. In this paper, we describe a virtual reality system for immersive surgical planning with dynamic three-dimensional echocardiography. The system enables fast prototyping of visualisation techniques such as volume rendering, multiplanar reformatting and flow visualisation, and of advanced interactions such as three-dimensional cropping, windowing, measurement, haptic feedback, automatic image orientation and multiuser collaboration. The available features were evaluated by imaging and non-imaging clinicians, showing that the virtual reality system can improve the understanding and communication of three-dimensional echocardiographic images and can potentially benefit the treatment of congenital heart disease.
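The paper itself includes no code, but as a minimal sketch of the volume rendering and windowing mentioned above, the following Python/VTK snippet sets up direct volume rendering of a volumetric image with a colour and opacity window. The file name, intensity range and transfer-function values are illustrative assumptions, not the authors' implementation (VTK is, however, the library named in the companion abstract below).

```python
import vtk

# Load a volumetric image (hypothetical file; exporting 3D echo data as
# a MetaImage volume is an assumption for this sketch).
reader = vtk.vtkMetaImageReader()
reader.SetFileName("echo_volume.mhd")
reader.Update()

# GPU ray casting performs the direct volume rendering.
mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(reader.GetOutputPort())

# "Windowing" maps the intensity range of interest to colour and opacity.
color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(0, 0.0, 0.0, 0.0)
color.AddRGBPoint(255, 1.0, 0.9, 0.8)
opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(40, 0.0)    # intensities below the window are transparent
opacity.AddPoint(200, 0.8)   # intensities above contribute strongly

prop = vtk.vtkVolumeProperty()
prop.SetColor(color)
prop.SetScalarOpacity(opacity)
prop.SetInterpolationTypeToLinear()
prop.ShadeOn()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```

In an actual VR application the render window would be driven by a headset compositor rather than a desktop interactor, but the mapper and transfer-function pipeline is the same.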
Funding Acknowledgements: Type of funding sources: Other. Main funding source(s): NIHR i4i funded 3D Heart Project; Wellcome/EPSRC Centre for Medical Engineering [WT 203148/Z/16/Z], on behalf of the 3D Heart Project.

Background/Introduction: Virtual reality (VR) for surgical and interventional planning in the treatment of congenital heart disease (CHD) is an emerging field with the potential to improve planning. Particularly in very complex cases, VR permits enhanced visualisation of, and more intuitive interaction with, volumetric images compared to traditional flat-screen visualisation tools. Blood flow is severely affected by CHD, so visualising blood flow allows direct observation of cardiac maladaptations for surgical planning. However, blood flow is fundamentally three-dimensional information, and viewing and interacting with it on conventional 2D displays is suboptimal.

Purpose: To demonstrate the feasibility of blood-flow visualisation in VR, using pressure and velocity fields obtained from a computational fluid dynamics (CFD) simulation of the right ventricle in a patient with hypoplastic left heart syndrome (HLHS), as a proof of concept.

Methods: We extended an existing VR volume rendering application to include CFD rendering functionality using the Visualization Toolkit (VTK), an established visualisation library widely used in clinical software for medical imaging data. Our prototype displays the mesh outline of the segmented heart, a slicing plane showing blood pressure within the heart, and streamlines of blood flow seeded from a spherical source region. Existing user tools were extended to enable interactive positioning, rotation and scaling of the pressure plane and streamline origin, ensuring continuity between volume rendering and CFD interaction and, thus, ease of use. We evaluated whether rendering and interaction times were low enough to ensure a comfortable, interactive VR experience; our performance benchmark is a previous study showing that VR is acceptable to clinical users when the rendering speed is at least 90 fps.

Results: CFD simulations were successfully rendered, viewed and manipulated in VR, as shown in the Figure. Visualisation of the mesh and streamlines ran at an acceptably high and stable frame rate of over 150 fps, and user interactions of moving, rotating or scaling the mesh or streamline origin did not significantly reduce this frame rate. However, rendering the pressure slicing plane reduced the frame rate to an unacceptable degree, to less than 10 fps.

Conclusion: Visualisation of and interaction with CFD simulation data was successfully integrated into an existing VR application. This aids planning of surgery and interventions for defects that rely heavily on blood-flow simulation, and lays the foundation for a platform on which clinicians can test interventions in VR. Pressure-plane rendering will require significant optimisation, potentially by updating the pressure-plane data separately from the main VR rendering loop.

Abstract Figure: An example render of the CFD simulation.
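As a rough illustration of the VTK pipeline described in the Methods (the mesh outline is omitted for brevity), the Python sketch below seeds streamlines from a spherical source and cuts a pressure-coloured slicing plane through a CFD result. The file name and the array names "velocity" and "pressure" are assumptions; the abstract's actual VR integration and user tools are not shown.

```python
import vtk

# Load the CFD result (hypothetical file; any VTK unstructured grid with
# point-data arrays named 'velocity' and 'pressure' would work here).
reader = vtk.vtkXMLUnstructuredGridReader()
reader.SetFileName("rv_cfd_result.vtu")
reader.Update()
cfd = reader.GetOutput()
cfd.GetPointData().SetActiveVectors("velocity")

# Spherical seed region for the streamlines (centre/radius are arbitrary).
seeds = vtk.vtkPointSource()
seeds.SetCenter(0.0, 0.0, 0.0)
seeds.SetRadius(5.0)
seeds.SetNumberOfPoints(100)

# Trace streamlines through the velocity field in both directions.
tracer = vtk.vtkStreamTracer()
tracer.SetInputData(cfd)
tracer.SetSourceConnection(seeds.GetOutputPort())
tracer.SetIntegratorTypeToRungeKutta45()
tracer.SetMaximumPropagation(200)
tracer.SetIntegrationDirectionToBoth()

stream_mapper = vtk.vtkPolyDataMapper()
stream_mapper.SetInputConnection(tracer.GetOutputPort())
stream_actor = vtk.vtkActor()
stream_actor.SetMapper(stream_mapper)

# Pressure slicing plane: cut the grid and colour the cut by 'pressure'.
plane = vtk.vtkPlane()
plane.SetOrigin(0.0, 0.0, 0.0)
plane.SetNormal(0.0, 0.0, 1.0)
cutter = vtk.vtkCutter()
cutter.SetCutFunction(plane)
cutter.SetInputData(cfd)

cut_mapper = vtk.vtkPolyDataMapper()
cut_mapper.SetInputConnection(cutter.GetOutputPort())
cut_mapper.SetScalarModeToUsePointFieldData()
cut_mapper.SelectColorArray("pressure")
lo, hi = cfd.GetPointData().GetArray("pressure").GetRange()
cut_mapper.SetScalarRange(lo, hi)
cut_actor = vtk.vtkActor()
cut_actor.SetMapper(cut_mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(stream_actor)
renderer.AddActor(cut_actor)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```

The frame-rate problem the abstract reports is consistent with this design: vtkCutter re-slices the full unstructured grid whenever the plane moves, which is why the authors suggest updating the pressure plane separately from the main VR rendering loop.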
Funding Acknowledgements: Type of funding sources: Other. Main funding source(s): NIHR i4i funded 3D Heart Project; Wellcome/EPSRC Centre for Medical Engineering (WT 203148/Z/16/Z), on behalf of the 3D Heart Project.

Background/Introduction: In echocardiography (echo), image orientation is determined by the position and direction of the transducer during the examination, unlike cardiovascular imaging modalities such as CT or MRI. As a result, when echo images are first displayed, their orientation carries no external anatomical landmarks, and the user has to identify anatomical landmarks in the regions of interest to understand the orientation.

Purpose: To display an anatomical model of a standard heart, automatically aligned to a patient's acquired 3D echo image, assisting image interpretation by quickly orienting the viewer.

Methods: 47 echo datasets from 13 paediatric patients with hypoplastic left heart syndrome (HLHS) were annotated by manually indicating the cardiac axes in both end-systolic (ES) and end-diastolic (ED) volumes. We chose a view akin to the standard four-chamber view in healthy hearts as the reference view, showing the atrioventricular valves, the right atrium, the left atrium and the hypoplastic ventricle. We then trained a deep convolutional neural network (CNN) to predict the rotation required to re-orient a volume to the reference view. Three data strategies were explored: 1) using the full 3D images to estimate orientation, 2) using three orthogonal slices only (2.5D approach), and 3) using the central slice only (2D approach). Three algorithms were investigated: 1) an orientation classifier, 2) an orientation regressor with mean absolute angle error, and 3) an orientation regressor with geodesic loss. The data was split into training, validation and test sets with an 8:1:1 ratio. The training data was augmented by applying random rotations in the range [−10°, +10°] and updating the labels accordingly. The model with the smallest validation error was deployed alongside the VR visualisation of the echo volumes.

Results: Experimental results suggest that a 2.5D CNN classifying discrete integer angles performs best at re-orienting volumetric images to the reference view, with a mean absolute angle error on the test set of 9.0° (test-set errors range from 10.8° to 25.9° for the other algorithms). As shown in Figure 1, an HLHS volume (left) is automatically aligned with the cardiac model (right) by the trained network when loaded in VR; the volume and the model are both cropped at the reference plane.

Conclusion: A deep learning network to align 3D echo images to a reference view was successfully trained and integrated into VR to reorient echo volumes to match a standard anatomical view. This work demonstrates the potential of combining artificial intelligence and VR in medical imaging, although further user studies are needed to evaluate its clinical impact.

Caption for Abstract Picture: The VR user interface informs the user of the 3D echo image orientation by automatically aligning it with an anatomical model, here showing the four-chamber apical view.

Abstract Figure: Deep learning model integrated into VR.
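As a hypothetical sketch of the winning 2.5D classification strategy, the PyTorch code below stacks the three central orthogonal slices of a volume as image channels and classifies discrete integer rotation angles. The network architecture, the number of angle classes, the single-rotation-axis formulation and the cubic-volume assumption are all illustrative choices; the abstract does not specify them.

```python
import torch
import torch.nn as nn

class OrientationClassifier2p5D(nn.Module):
    """Sketch of a 2.5D orientation classifier: the three orthogonal
    central slices of a 3D echo volume enter as a 3-channel 2D image,
    and the output is logits over discrete rotation-angle classes
    (360 one-degree bins about a single axis is an assumption)."""

    def __init__(self, n_classes: int = 360):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(128, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

def orthogonal_slices(volume: torch.Tensor) -> torch.Tensor:
    """Stack the three central orthogonal slices of a (D, H, W) volume
    as channels; a cubic volume is assumed so the slices share a shape."""
    d, h, w = volume.shape
    return torch.stack([volume[d // 2], volume[:, h // 2], volume[:, :, w // 2]])

# Example usage with a random cubic volume and an arbitrary angle label.
vol = torch.rand(64, 64, 64)
x = orthogonal_slices(vol).unsqueeze(0)          # (1, 3, 64, 64)
model = OrientationClassifier2p5D(n_classes=360)
logits = model(x)                                 # (1, 360) angle-class logits
loss = nn.CrossEntropyLoss()(logits, torch.tensor([90]))
```

Treating orientation as classification over integer angles, rather than regression, sidesteps the wrap-around discontinuity at ±180° that a plain mean-absolute-angle regressor must handle, which may explain the classifier's better test error here.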