In magnetic resonance imaging (MRI), epicardial adipose tissue (EAT) overload often remains overlooked because manual contouring of images is tedious. We propose automated four-chamber EAT area quantification based on deep-learning segmentation with multi-frame fully convolutional networks (FCN). One hundred subjects, comprising healthy, obese, and diabetic patients, underwent 3T cardiac cine MRI. An optimized U-Net and an FCN (denoted FCNB) were trained on three consecutive cine frames to segment the central frame, using the Dice loss. Networks were trained with 4-fold cross-validation (n = 80) and evaluated on an independent dataset (n = 20). Segmentation performance was compared with inter- and intra-observer bias using the Dice similarity coefficient (DSC) and relative surface error (RSE). Four-chamber EAT areas at both systole and diastole correlated with total EAT volume (r = 0.77 and 0.74, respectively). Network performance was equivalent to inter-observer bias (EAT: DSCInter = 0.76, DSCU-Net = 0.77, DSCFCNB = 0.76). The U-Net outperformed FCNB on all metrics (p < 0.0001). The proposed multi-frame U-Net provided automated EAT area quantification with 14.2% precision over the clinically relevant upper three quarters of the EAT area range, stratifying patients' risk of EAT overload with 70% accuracy. Exploiting a multi-frame U-Net on standard cine images thus provides automated EAT quantification over a wide range of EAT quantities. The method is made available to the community as an FSLeyes plugin.
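As an illustration of the overlap metric and training objective mentioned above, here is a minimal pure-Python sketch of the Dice similarity coefficient (DSC) and the corresponding Dice loss; this is a generic formulation for binary masks, not the authors' implementation, and the function names are illustrative:

```python
def dice_coefficient(pred, target, eps=1e-6):
    """Dice similarity coefficient (DSC) between two binary masks.

    `pred` and `target` are flat sequences of 0/1 values (flattened
    segmentation masks); `eps` avoids division by zero on empty masks.
    DSC = 2 * |A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical).
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)


def dice_loss(pred, target):
    """Dice loss commonly used to train segmentation networks: 1 - DSC."""
    return 1.0 - dice_coefficient(pred, target)


# Identical masks give DSC close to 1 (loss near 0);
# disjoint masks give DSC close to 0 (loss near 1).
a = [1, 1, 0, 0]
b = [1, 1, 0, 0]
c = [0, 0, 1, 1]
```

In practice this loss is computed on soft network outputs (probabilities in [0, 1]) rather than hard 0/1 masks, which keeps it differentiable for gradient-based training.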
Quantitative analysis of abdominal organ motion and deformation is crucial to better understand the biomechanical alterations underlying respiratory, digestive, or perineal pathophysiology. In particular, biomechanical characterization of the antero-lateral abdominal wall is central to the diagnosis of abdominal muscle deficiency. Here, we present a dedicated semiautomatic dynamic MRI postprocessing method enabling the quantification of spatial and temporal deformations of the antero-lateral abdominal wall muscles. Ten healthy participants were imaged during a controlled breathing session at the L3–L4 disc level using real-time dynamic MRI at 3 T. A coarse feature-tracking step allowed the selection of the inhalation cycle of maximum abdominal excursion. Over this image series, the described method combines (1) a supervised 2D+t segmentation procedure of the abdominal wall muscles, (2) the quantification of muscle deformations based on mask registration, and (3) the mapping of deformations within muscle subzones leveraging a dedicated automatic parcellation. The supervised 2D+t segmentation (1) provided an accurate segmentation of the abdominal wall muscles throughout maximum inhalation, with a 0.95 ± 0.03 Dice similarity coefficient (DSC) and a 2.3 ± 0.7 mm Hausdorff distance, while requiring manual segmentation of only 20% of the data. The robustness of the deformation quantification (2) was indicated by high indices of correspondence between the registered source mask and the target mask (0.98 ± 0.01 DSC and 2.1 ± 1.5 mm Hausdorff distance). Parcellation (3) enabled the distinction of muscle substructures that are anatomically relevant but cannot be distinguished based on image contrast. This original postprocessing method provides a quantitative analytical framework that could be used in further studies to better understand abdominal wall deformations in physiological and pathological situations.
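The registration quality above is reported with the Hausdorff distance between mask contours. As a minimal sketch (a generic brute-force formulation, not the authors' pipeline, and with illustrative function names), the symmetric Hausdorff distance between two 2D point sets can be computed as:

```python
def directed_hausdorff(points_a, points_b):
    """Directed Hausdorff distance: for each point of A, take the distance
    to its nearest point in B, then keep the worst (largest) of these."""
    return max(
        min(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in points_b)
        for ax, ay in points_a
    )


def hausdorff_distance(points_a, points_b):
    """Symmetric Hausdorff distance between two contours given as
    lists of (x, y) points, e.g. mask boundary pixels in mm units."""
    return max(
        directed_hausdorff(points_a, points_b),
        directed_hausdorff(points_b, points_a),
    )


# Toy contours: the farthest mismatched point dominates the distance.
contour_a = [(0.0, 0.0), (1.0, 0.0)]
contour_b = [(0.0, 0.0), (3.0, 0.0)]
```

For real masks one would extract boundary points from the binary images and scale them by the pixel spacing; dedicated implementations (e.g. in SciPy or SimpleITK) avoid the quadratic cost of this brute-force version.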