Background and purpose: Treatment planning of radiotherapy for locally advanced breast cancer patients can be a time-consuming process. Artificial-intelligence-based treatment planning could be used as a tool to speed up this process while maintaining consistent plan quality. The purpose of this study was to create treatment plans for locally advanced breast cancer patients using a Convolutional Neural Network (CNN).
Materials and methods: Data from 60 patients treated for left-sided breast cancer were used, with a training/validation/test split of 36/12/12, respectively. The in-house built CNN model was a hierarchically densely connected U-net (HD U-net). The inputs for the HD U-net were 2D distance maps of the relevant regions of interest. Dose predictions generated by the HD U-net were fed to a mimicking algorithm to create clinically deliverable plans.
Results: Dose predictions were generated by the HD U-net and mimicked using a commercial treatment planning system. The predicted plans fulfilled all clinical goals while showing small (≤0.5 Gy) statistically significant differences (p < 0.05) in the doses compared to the manual plans. The mimicked plans showed statistically significant differences in the average doses for the heart and lung of ≤0.5 Gy and a reduced D2% for all PTVs. In total, ten of the twelve mimicked plans were clinically acceptable.
Conclusions: We created a CNN model that can generate clinically acceptable plans for left-sided locally advanced breast cancer patients. This model shows great potential to speed up the treatment planning process while maintaining consistent plan quality.
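The abstract states that the HD U-net takes 2D distance maps of the regions of interest as input but gives no implementation details. The following is a minimal sketch, assuming binary ROI masks and SciPy's Euclidean distance transform; the function names, the signed-distance convention, and the per-ROI channel stacking are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumptions, not the authors' implementation): building
# signed 2D distance-map input channels for a dose-prediction CNN from
# binary ROI masks (e.g. PTVs, heart, lung).
import numpy as np
from scipy.ndimage import distance_transform_edt


def signed_distance_map(mask: np.ndarray, spacing=(1.0, 1.0)) -> np.ndarray:
    """Signed Euclidean distance to the ROI boundary.

    Negative inside the ROI, positive outside, zero on the boundary.
    `spacing` is the pixel size; the abstract does not state the grid
    resolution, so this default is an assumption.
    """
    mask = mask.astype(bool)
    inside = distance_transform_edt(mask, sampling=spacing)
    outside = distance_transform_edt(~mask, sampling=spacing)
    return outside - inside


def build_input_channels(roi_masks: dict) -> np.ndarray:
    """Stack one distance-map channel per ROI into a (C, H, W) array,
    the kind of multi-channel 2D input an HD U-net could consume."""
    return np.stack([signed_distance_map(m) for m in roi_masks.values()])


if __name__ == "__main__":
    # Toy example: a circular "PTV" and a rectangular "heart" on a 128x128 slice.
    yy, xx = np.mgrid[:128, :128]
    ptv = (yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2
    heart = np.zeros((128, 128), dtype=bool)
    heart[80:110, 30:70] = True
    channels = build_input_channels({"PTV": ptv, "Heart": heart})
    print(channels.shape)  # (2, 128, 128)
```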
Preterm infants in a neonatal intensive care unit (NICU) are continuously monitored for their vital signs, such as heart rate and oxygen saturation. Body motion patterns are documented only intermittently through clinical observation. Changing motion patterns in preterm infants are associated with maturation and with clinical events such as late-onset sepsis and seizures. However, continuous motion monitoring in the NICU setting is not yet performed. Video-based motion monitoring is a promising method due to its non-contact, and therefore unobtrusive, nature. This study aims to determine the feasibility of simple video-based methods for infant body motion detection. We investigated and compared four methods to detect motion in videos of infants, using two datasets acquired with different types of cameras. The thermal dataset contains 32 hours of annotated videos from 13 infants in open beds. The RGB dataset contains 9 hours of annotated videos from 5 infants in incubators. The compared methods are background subtraction (BS), sparse optical flow (SOF), dense optical flow (DOF), and oriented FAST and rotated BRIEF (ORB). Detection performance and computation time were evaluated by the area under the receiver operating characteristic curve (AUC) and run time. We conducted separate experiments for detecting any motion and gross motion. On the thermal dataset, the best performance in both experiments is achieved by BS, with mean (standard deviation) AUCs of 0.86 (0.03) and 0.93 (0.03). On the RGB dataset, SOF outperforms the other methods in both experiments, with AUCs of 0.82 (0.10) and 0.91 (0.05). All methods are efficient enough to be integrated into a camera system when low-resolution thermal cameras are used.
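The abstract reports background subtraction as the best-performing detector on the thermal data but does not describe the pipeline. Below is a minimal sketch, assuming OpenCV's MOG2 background subtractor, a per-frame foreground-pixel fraction as the motion score, and scikit-learn's roc_auc_score for the AUC evaluation; all names, parameters, and thresholds are illustrative assumptions rather than the study's actual method.

```python
# Minimal sketch (assumptions, not the study's pipeline): per-frame motion
# scoring by background subtraction, evaluated against binary annotations.
import cv2
import numpy as np
from sklearn.metrics import roc_auc_score


def motion_scores(video_path: str) -> np.ndarray:
    """Return one motion score per frame: the fraction of pixels the
    MOG2 background model marks as foreground."""
    cap = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
    scores = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        fg = subtractor.apply(gray)
        scores.append(np.count_nonzero(fg) / fg.size)
    cap.release()
    return np.asarray(scores)


def evaluate(video_path: str, labels: np.ndarray) -> float:
    """AUC of the motion scores against per-frame 0/1 motion annotations.
    `labels` must have one entry per frame of the video."""
    scores = motion_scores(video_path)
    return roc_auc_score(labels[: len(scores)], scores)
```

A usage example would be `evaluate("infant_01_thermal.mp4", annotations)`, where `annotations` is a NumPy array of per-frame motion labels; both the file name and the annotation format are hypothetical here.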