Magnetic resonance (MR) guided high-intensity focused ultrasound and external beam radiotherapy interventions, which we shall refer to as beam therapies/interventions, are promising techniques for the non-invasive ablation of tumours in abdominal organs. However, therapeutic energy delivery in these areas becomes challenging due to the continuous displacement of the organs with respiration. Previous studies have addressed this problem by coupling high-framerate MR imaging with a tracking technique based on the algorithm proposed by Horn and Schunck (H&S), which was chosen for its fast convergence rate and highly parallelisable numerical scheme. Such characteristics were shown to be indispensable for the real-time guidance of beam therapies. In its original form, however, the algorithm is sensitive to local grey-level intensity variations not attributed to motion, such as those that occur, for example, in the proximity of pulsating arteries. In this study, an improved motion estimation strategy which reduces the impact of such effects is proposed. Displacements are estimated through the minimisation of a variant of the H&S functional in which the quadratic data fidelity term is replaced by a term based on the L1 norm, resulting in what we have called an L2-L1 functional. The proposed method was tested in the livers and kidneys of two healthy volunteers under free-breathing conditions, on a data set comprising 3000 images equally divided between the volunteers. The results show that, compared to existing approaches, our method demonstrates a greater robustness to local grey-level intensity variations introduced by arterial pulsations. Additionally, the computational time required by our implementation makes it compatible with the workflow of real-time MR-guided beam interventions. To the best of our knowledge, this study is the first to analyse the behaviour of an L1-based optical flow functional in an applicative context: real-time MR guidance of beam therapies in moving organs.
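To make the modification explicit, the following is a minimal sketch of the two functionals in standard optical-flow notation (spatial and temporal image derivatives I_x, I_y, I_t, displacement field (u, v), regularisation weight alpha); the exact notation of the original formulation may differ. The classical H&S functional penalises the optical-flow residual quadratically, whereas the L2-L1 variant applies an L1 penalty to the data term while keeping the quadratic (L2) smoothness regulariser:

    E_{H&S}(u, v) = \int_{\Omega} (I_x u + I_y v + I_t)^2 \, d\Omega + \alpha \int_{\Omega} \left( |\nabla u|^2 + |\nabla v|^2 \right) d\Omega

    E_{L2\text{-}L1}(u, v) = \int_{\Omega} \left| I_x u + I_y v + I_t \right| d\Omega + \alpha \int_{\Omega} \left( |\nabla u|^2 + |\nabla v|^2 \right) d\Omega

Because the L1 data term grows only linearly with the residual, localised grey-level changes that violate brightness constancy (such as those near pulsating arteries) are penalised far less than under the quadratic term, which is the source of the increased robustness reported above.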
Medical imaging is currently employed in the diagnosis, planning, delivery and response monitoring of cancer treatments. Due to physiological motion and/or treatment response, the shape and location of the pathology and organs-at-risk may change over time. Establishing their location within the acquired images is therefore paramount for accurate treatment delivery and monitoring. A feasible solution for tracking anatomical changes during an image-guided cancer treatment is provided by image registration algorithms. Such methods are, however, often built upon elements originating from the computer vision/graphics domain. Since the original design of such elements did not take into consideration the material properties of particular biological tissues, the anatomical plausibility of the estimated deformations may not be guaranteed. In the current work, we adapt two existing variational registration algorithms, namely Horn-Schunck and EVolution, to online soft tissue tracking. This is achieved by enforcing an incompressibility constraint on the estimated deformations during the registration process. The existing and the modified registration methods were comparatively tested against several quality assurance criteria on abdominal in vivo MR and CT data. These criteria included: the Dice similarity coefficient (DSC), the Jaccard index, the target registration error (TRE) and three additional criteria evaluating the anatomical plausibility of the estimated deformations. Results demonstrated that both the original and the modified registration methods have similar registration capabilities in high-contrast areas, with DSC and Jaccard index values predominantly in the 0.8-0.9 range and an average TRE of 1.6-2.0 mm. In contrast-devoid regions of the liver and kidneys, however, the three additional quality assurance criteria indicated a considerable improvement in the anatomical plausibility of the deformations estimated by the incompressibility-constrained methods. Moreover, the proposed registration models maintain the potential of the original methods for online image-based guidance of cancer treatments.
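As a concrete illustration of the evaluation criteria named above, the sketch below shows how the two overlap measures are typically computed from binary segmentation masks, together with one common incompressibility check based on the Jacobian determinant of the deformation. This is an assumed, minimal implementation for illustration only; the study's three additional plausibility criteria are not specified here and may differ.

    # Minimal sketch (assumed, not the study's exact implementation) of the
    # overlap criteria named above and one common incompressibility check.
    import numpy as np

    def dice(mask_a, mask_b):
        # Dice similarity coefficient (DSC) between two boolean segmentation masks.
        inter = np.logical_and(mask_a, mask_b).sum()
        return 2.0 * inter / (mask_a.sum() + mask_b.sum())

    def jaccard(mask_a, mask_b):
        # Jaccard index (intersection over union) between two boolean masks.
        inter = np.logical_and(mask_a, mask_b).sum()
        union = np.logical_or(mask_a, mask_b).sum()
        return inter / union

    def jacobian_determinant_2d(u, v):
        # Voxel-wise Jacobian determinant of a 2D deformation x -> x + (u, v).
        # Values close to 1 indicate a locally volume-preserving (incompressible)
        # deformation; this is one common plausibility check, not necessarily the
        # criterion used in the study.
        du_dy, du_dx = np.gradient(u)
        dv_dy, dv_dx = np.gradient(v)
        return (1.0 + du_dx) * (1.0 + dv_dy) - du_dy * dv_dx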
This study proposes a motion correction strategy for displacements resulting from slowly varying physiological motion that might occur during an MR-guided HIFU intervention. The authors have shown that such drifts can lead to a misalignment between interventional planning, energy delivery, and therapeutic validation. The presented volunteer study and in vivo experiment demonstrate both the relevance of the problem for HIFU therapies and the compatibility of the proposed motion compensation framework with the workflow of a HIFU intervention under clinical conditions.
Image registration is part of a large variety of medical applications including diagnosis, monitoring of disease progression and/or treatment effectiveness and, more recently, therapy guidance. Such applications usually involve several imaging modalities such as ultrasound, computed tomography, positron emission tomography, x-ray or magnetic resonance imaging, either separately or combined. In the current work, we propose a non-rigid multi-modal registration method (namely EVolution: an edge-based variational method for non-rigid multi-modal image registration) that aims at maximizing edge alignment between the images being registered. The proposed algorithm requires only contrasts between physiological tissues, preferably present in both image modalities, and assumes deformable/elastic tissues. Given both assumptions, the method is shown to be well suited for non-rigid co-registration across different image types/contrasts (T1/T2) as well as different modalities (CT/MRI). This is achieved using a variational scheme that provides a fast algorithm with a low number of control parameters. Results obtained on an annotated CT data set were comparable to the ones provided by state-of-the-art multi-modal image registration algorithms, for all tested experimental conditions (image pre-filtering, image intensity variation, noise perturbation). Moreover, we demonstrate that, compared to existing approaches, our method possesses increased robustness to transient structures (i.e. structures that are only present in some of the images).
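As an illustration of what maximising edge alignment can look like, one common formulation scores the local agreement of normalised image gradients; the actual EVolution data term may be defined differently, so the following should be read as a hedged sketch rather than the method itself. With reference image I, moving image J, deformation phi and a small constant epsilon stabilising flat regions:

    D(\varphi) = \int_{\Omega} \frac{\left| \nabla I(x) \cdot \nabla J(\varphi(x)) \right|}{\left| \nabla I(x) \right| \left| \nabla J(\varphi(x)) \right| + \epsilon} \, dx

A criterion of this kind depends only on the orientation of edges, not on absolute grey levels, which is why it remains meaningful across contrasts (T1/T2) and modalities (CT/MRI); in a variational scheme it would be maximised jointly with a smoothness or elasticity regulariser on phi.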