Medical Imaging 2020: Image Processing
DOI: 10.1117/12.2549716

Incorporating minimal user input into deep learning based image segmentation

Abstract: Computer-assisted image segmentation techniques could help clinicians perform border delineation faster and with lower inter-observer variability. Recently, convolutional neural networks (CNNs) have become widely used for automatic image segmentation. In this study, we used a technique that incorporates observer inputs to supervise CNNs and improve segmentation accuracy: a set of sparse surface points is added as an additional input to supervise the CNNs for more accurate image segmentation. …
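The excerpt above is all this page reproduces of the paper, but the core idea it describes is feeding sparse, observer-provided surface points to the network alongside the image. Below is a minimal sketch of that idea, assuming a PyTorch pipeline, a toy convolutional network, and a Gaussian heat-map encoding of the points; the authors' actual architecture, point encoding, and training procedure are not given in this excerpt.

```python
import torch
import torch.nn as nn

def points_to_channel(points, shape, sigma=2.0):
    """Rasterize sparse (row, col) surface points into a soft heat-map the
    same size as the image. A Gaussian blob per point is an assumed
    encoding; the paper's exact point representation is not specified here."""
    heat = torch.zeros(shape)
    rows = torch.arange(shape[0]).float().view(-1, 1)
    cols = torch.arange(shape[1]).float().view(1, -1)
    for r, c in points:
        blob = torch.exp(-((rows - r) ** 2 + (cols - c) ** 2) / (2 * sigma ** 2))
        heat = torch.maximum(heat, blob)  # keep the strongest response per pixel
    return heat

class PointGuidedSegNet(nn.Module):
    """Toy two-channel-input segmentation CNN: channel 0 is the image,
    channel 1 the sparse-point map. A stand-in for the supervised CNN."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # per-pixel foreground logit
        )

    def forward(self, image, point_map):
        x = torch.cat([image, point_map], dim=1)  # stack as input channels
        return self.net(x)

# Usage: one 128x128 slice with three observer-clicked border points
# (hypothetical coordinates, for illustration only).
image = torch.rand(1, 1, 128, 128)
points = [(40, 60), (64, 90), (88, 62)]
point_map = points_to_channel(points, (128, 128)).view(1, 1, 128, 128)
logits = PointGuidedSegNet()(image, point_map)
print(logits.shape)  # torch.Size([1, 1, 128, 128])
```

Encoding the clicks as a soft heat-map rather than single hot pixels is one common design choice in interactive segmentation: it gives the convolutions a spatially extended guidance signal, so a handful of points can still influence the whole border region.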

Cited by 4 publications (2 citation statements)
References 14 publications

“…For tasks such as this, i.e. object detection of unnatural shapes, the U-Net fails to perform satisfactorily [24]. The Unbalanced-UNet solves this problem by limiting the feature extraction.…”
Section: Region Proposals for the Spine Using Unbalanced-UNet (mentioning)
confidence: 99%
“…While interactions are easy and fast to set, physicians are always first presented with an auto-segmentation for annotation, which could bias the delineation process. An approach where the physician provides a limited amount of surface points [19] or a few slice contours to guide the auto-segmentation might prevent this segmentation bias.…”
Section: Introduction (mentioning)
confidence: 99%