2020 Tenth International Conference on Image Processing Theory, Tools and Applications (IPTA)
DOI: 10.1109/ipta50016.2020.9286712

Data augmentation for multi-organ detection in medical images

Abstract: We propose a deep learning solution to the problem of object detection in 3D medical images, i.e. the localization and classification of multiple structures. Supervised learning methods require large annotated datasets that are usually difficult to acquire. We thus develop a combined Cycle Generative Adversarial Network (CycleGAN) and You Only Look Once (YOLO) method: data augmentation from one modality to another via CycleGAN, and organ detection on the generated images via YOLO. This results in a fast and ac…
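The abstract describes a two-stage pipeline: a pretrained CycleGAN generator translates annotated images from one modality to another, and the translated images (with the original bounding boxes, which remain valid because image-to-image translation preserves spatial layout) enlarge the training set of a YOLO-style detector. The sketch below illustrates only that augmentation step; it is not the authors' implementation, and the ResnetGenerator class, the augment_dataset helper, and the toy data are hypothetical placeholders.

# Minimal sketch (assumed setup): translate modality-A slices to synthetic
# modality-B slices with a CycleGAN-style generator and reuse the original
# YOLO-format boxes for the generated images.
import torch
import torch.nn as nn

class ResnetGenerator(nn.Module):
    """Toy CycleGAN-style generator (downsample -> conv -> upsample); in
    practice the pretrained CycleGAN generator weights would be loaded."""
    def __init__(self, channels=1, features=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, features, 7, padding=3), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(features, features, 3, stride=2, padding=1, output_padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, channels, 7, padding=3), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

def augment_dataset(images_a, boxes_a, generator):
    """Return (synthetic modality-B image, original boxes) pairs.

    Boxes are kept unchanged: the translation alters intensity/texture,
    not organ positions, so the annotations transfer directly.
    """
    generator.eval()
    augmented = []
    with torch.no_grad():
        for img, boxes in zip(images_a, boxes_a):
            fake_b = generator(img.unsqueeze(0)).squeeze(0)
            augmented.append((fake_b, boxes))
    return augmented

if __name__ == "__main__":
    g_a2b = ResnetGenerator()                                     # load pretrained weights in practice
    slices = [torch.rand(1, 256, 256) for _ in range(4)]          # toy 2D slices
    boxes = [torch.tensor([[0, 0.5, 0.5, 0.2, 0.3]]) for _ in slices]  # class, cx, cy, w, h
    synthetic = augment_dataset(slices, boxes, g_a2b)
    print(len(synthetic), synthetic[0][0].shape)

The augmented pairs would then be appended to the real modality-B training data before training the YOLO detector.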

Cited by 4 publications (1 citation statement). References 19 publications.
“…Therefore, although our pipeline slightly deviates from the ground truth, a fully automated approach reduces measurement variation error and therefore allows for smaller changes to be measured over time. We also acknowledge that other deep learning segmentation models such as YOLO [36] and DeepLab [37] should be explored. Future work should investigate the performance of each for this application.…”
Section: Discussion
Confidence: 99%