Focused ultrasound (FUS) therapy has emerged as a promising non-invasive approach to tumor ablation. Accurate monitoring and guidance of the delivered ultrasound energy are crucial for effective FUS treatment. Although ultrasound (US) imaging is a well-suited modality for FUS monitoring, US-guided FUS (USgFUS) faces challenges in achieving precise monitoring, leading to unpredictable ablation shapes and a lack of quantitative measurement. To address these challenges, we propose an artificial intelligence (AI)-assisted USgFUS framework that integrates an AI segmentation model with ultrasound B-mode imaging for quantitative, real-time monitoring of FUS treatment. The AI framework accurately identifies and labels ablated areas in the B-mode images captured during and after each FUS sonication, in real time. To assess the feasibility of the proposed method, we developed an AI segmentation framework based on the Swin-Unet architecture and conducted an in vitro experimental study using a USgFUS setup and chicken breast tissue. The results indicated that the developed AI segmentation framework could immediately label the ablated tissue areas with \(93\%\) accuracy. These findings suggest that AI-assisted ultrasound monitoring can significantly improve the precision and accuracy of FUS treatments, marking a crucial step toward more effective FUS treatment strategies.
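As a rough illustration of the per-frame monitoring step described above, the sketch below shows how a trained Swin-Unet-style segmentation network might be applied to individual B-mode frames and how pixel-level agreement with a ground-truth mask could be computed. The `model` object, function names, and the 0.5 threshold are assumptions for illustration, not details taken from the paper.

```python
import torch

def label_ablated_region(model, bmode_frame, threshold=0.5):
    """Segment a single B-mode frame with a trained network.

    `model` is assumed to be a Swin-Unet-style network mapping a
    1-channel B-mode image to a 1-channel logit map; all names and
    the 0.5 threshold are illustrative, not from the paper.
    """
    model.eval()
    with torch.no_grad():
        # (H, W) -> (1, 1, H, W): add batch and channel dimensions
        x = bmode_frame.unsqueeze(0).unsqueeze(0).float()
        logits = model(x)
        # Binary mask of pixels predicted as ablated tissue
        mask = (torch.sigmoid(logits) > threshold).squeeze()
    return mask

def pixel_accuracy(pred_mask, true_mask):
    """Fraction of pixels whose predicted label matches the ground truth."""
    return (pred_mask == true_mask).float().mean().item()
```

In such a setup, `label_ablated_region` would be called on each incoming B-mode frame during and after sonication, and `pixel_accuracy` (or a comparable overlap metric) would be evaluated offline against manually annotated masks to quantify segmentation performance.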