Purpose: To create a network that fully utilizes multi-sequence MRI and compares favorably with manual human contouring.

Methods: We retrospectively collected 89 MRI studies of the pelvic cavity from patients with prostate cancer and cervical cancer. The dataset contained 89 samples from 87 patients, of which 84 samples were valid. MRI was performed with T1-weighted (T1), T2-weighted (T2), and enhanced Dixon T1-weighted (T1DIXONC) sequences. There were two cohorts: a training cohort of 55 samples and a testing cohort of 29 samples. The MRI images in the training cohort contained contouring data from radiotherapist α; those in the testing cohort contained contouring data from both radiotherapist α and a second radiotherapist, radiotherapist β. The training cohort was used to optimize the convolutional neural networks, which incorporated an attention mechanism through the proposed activation module and fused multiple MRI sequences through the proposed blended module, to perform autodelineation. The testing cohort was used to assess the networks' autodelineation performance. The contoured organs at risk (OARs) were the anal canal, bladder, rectum, left femoral head, and right femoral head.

Results: We compared our proposed network with UNet and FuseUNet on our dataset. With T1 as the main sequence, we input three sequences to segment the five organs and evaluated the results using four metrics: the Dice similarity coefficient (DSC), the Jaccard similarity coefficient (JSC), the average surface distance (ASD), and the 95th percentile Hausdorff distance (95% HD). The proposed network outperformed the baselines on all metrics. The DSC was 0.834±0.029, 0.818±0.037, and 0.808±0.050 for our proposed network, FuseUNet, and UNet, respectively. The 95% HD was 7.256±2.748 mm, 8.404±3.297 mm, and 8.951±4.798 mm for our proposed network, FuseUNet, and UNet, respectively.
Our proposed network also performed better on the JSC and ASD metrics.

Conclusion: Our proposed activation module and blended module significantly improved the performance of FuseUNet for multi-sequence MRI segmentation. Our proposed network integrated multiple MRI sequences efficiently and autosegmented the OARs rapidly and accurately. We also found that three-sequence fusion (T1-T1DIXONC-T2) was superior to two-sequence fusion (T1-T2 and T1-T1DIXONC, respectively). We infer that fusing more MRI sequences yields better automatic segmentation results.
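The overlap metrics reported above (DSC and JSC) reduce to simple set operations on binary masks. As a minimal illustrative sketch (not the evaluation code used in the study), they can be computed with NumPy as follows; the function name and toy masks are our own for illustration:

```python
import numpy as np

def dice_jaccard(pred: np.ndarray, gt: np.ndarray):
    """Compute the Dice (DSC) and Jaccard (JSC) coefficients
    for a pair of binary segmentation masks."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dsc = 2.0 * inter / (pred.sum() + gt.sum())  # 2|A∩B| / (|A|+|B|)
    jsc = inter / union                          # |A∩B| / |A∪B|
    return dsc, jsc

# Toy 2x3 masks: 2 overlapping voxels, 3 voxels in each mask
pred = np.array([[1, 1, 0], [0, 1, 0]])
gt = np.array([[1, 0, 0], [0, 1, 1]])
dsc, jsc = dice_jaccard(pred, gt)  # dsc = 4/6, jsc = 2/4
```

The two distance metrics (ASD and 95% HD) additionally require extracting surface voxels and computing nearest-neighbor distances between the two surfaces, which is why they are typically taken from a dedicated library rather than written by hand.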
Interactive segmentation enables users to segment objects as needed by providing cues, which introduces human-computer interaction to many fields, such as image editing and medical image analysis. Typically, training deep models requires massive and expensive pixel-level annotations, with object-oriented interactions derived from manually labeled object masks. In this work, we show that informative interactions can be simulated in an unsupervised paradigm through semantically consistent yet diverse region exploration. Concretely, we introduce a Multi-granularity Interaction Simulation (MIS) approach that opens up a promising direction for unsupervised interactive segmentation. Drawing on the high-quality dense features produced by recent self-supervised models, we propose to gradually merge patches or regions with similar features into more extensive regions, so that every merged region serves as a semantically meaningful multi-granularity proposal. By randomly sampling these proposals and simulating possible interactions based on them, we provide meaningful interactions at multiple granularities to teach the model to understand interactions. MIS significantly outperforms non-deep-learning unsupervised methods and is even comparable with some previous deep supervised methods, without using any annotation.
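The core idea of growing multi-granularity proposals by merging feature-similar regions can be sketched as a toy greedy procedure. The following is our own simplified 1-D illustration under assumed inputs (a list of per-patch feature vectors), not the MIS algorithm itself, which operates on dense 2-D self-supervised features:

```python
import numpy as np

def multigranularity_proposals(features):
    """Toy sketch: repeatedly merge the pair of adjacent regions
    whose mean features are closest. Every intermediate partition
    is kept as a coarser-granularity region proposal."""
    # Each region is (list of member patch indices, mean feature vector)
    regions = [([i], f.astype(float)) for i, f in enumerate(features)]
    proposals = [[r[0][:] for r in regions]]  # finest granularity
    while len(regions) > 1:
        # Distance between each adjacent pair of region means
        dists = [np.linalg.norm(regions[i][1] - regions[i + 1][1])
                 for i in range(len(regions) - 1)]
        k = int(np.argmin(dists))
        # Merge regions k and k+1; update the size-weighted mean
        idx = regions[k][0] + regions[k + 1][0]
        n0, n1 = len(regions[k][0]), len(regions[k + 1][0])
        mean = (regions[k][1] * n0 + regions[k + 1][1] * n1) / (n0 + n1)
        regions[k:k + 2] = [(idx, mean)]
        proposals.append([r[0][:] for r in regions])
    return proposals

# Four patches: two near 0.0, two near 5.0
feats = np.array([[0.0], [0.1], [5.0], [5.2]])
props = multigranularity_proposals(feats)
# props[1] == [[0, 1], [2], [3]]: the two most similar patches merge first
```

Sampling regions from such a hierarchy of partitions yields candidate "objects" at several scales, which is what allows interactions to be simulated without any manual masks.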