Medical Imaging 2021: Image-Guided Procedures, Robotic Interventions, and Modeling 2021
DOI: 10.1117/12.2579256
Neural network pruning for biomedical image segmentation

Cited by 4 publications (2 citation statements) · References 0 publications
“…• https://github.com/naver/force An interesting, albeit less common, application for pruning models is within the context of segmentation. In a recent paper Jeong et al (2021) train and prune the U-Net (Ronneberger et al, 2015) architecture on two image datasets from the Cell Tracking Challenge (PhC-C2DH-U373 and DIC-C2DH-HeLa). They use the classic multi-step approach of gradually applying magnitude-pruning interleaved with fine-tuning stages.…”
Section: A3 Implementations
confidence: 99%
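The "classic multi-step approach" the citation describes — gradually raising the pruned fraction, with fine-tuning between steps — can be sketched in plain Python. This is an illustrative sketch with hypothetical helper names, not code from the cited papers:

```python
def magnitude_prune(weights, prune_ratio):
    """Zero out the smallest-magnitude fraction of a flat weight list
    (unstructured magnitude pruning)."""
    n_prune = int(len(weights) * prune_ratio)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight.
    # (Ties at the threshold may prune slightly more than n_prune weights.)
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]


def gradual_prune_schedule(target_ratio, n_steps):
    """Cumulative prune ratios for an iterative prune/fine-tune loop,
    e.g. 0.3 -> 0.6 -> 0.9 over three steps."""
    return [target_ratio * (s + 1) / n_steps for s in range(n_steps)]
```

In the multi-step scheme, each call to `magnitude_prune` at the next schedule ratio would be followed by fine-tuning epochs on the segmentation task before pruning further.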
“…To evaluate the flexibility of our method we used meta-gradients at the beginning of training (on a randomly initialized U-Net), prune in a single shot, and train the network once for the same number of epochs (50). We kept the training set-up the same as the baseline by Jeong et al (2021) (i.e., resizing images and segmentation maps to (256,256), setting aside 30% of training data for validation) and similarly aim to find the highest prune ratio that does not result in IOU degradation. We report intersection-over-union (IOU) metric for the two datasets in Tables 8 and 9: These results show that our method works as well (or better) compared to this compute-expensive baseline, in the sense that we can prune more parameters while keeping the IOU score the same.…”
Section: A3 Implementations
confidence: 99%
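The intersection-over-union (IOU) score used to compare the two pruning approaches above is, per image, the overlap between predicted and ground-truth masks divided by their union. A minimal sketch for flat binary masks (the function name is ours, not from either paper):

```python
def iou(pred, target):
    """Intersection-over-union between two binary segmentation masks,
    given as equal-length flat lists of 0/1 values."""
    inter = sum(1 for p, t in zip(pred, target) if p and t)
    union = sum(1 for p, t in zip(pred, target) if p or t)
    # Convention: two empty masks count as perfect agreement.
    return inter / union if union else 1.0
```

"No IOU degradation" in the quoted evaluation then means this score stays at the unpruned baseline's level as the prune ratio increases.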