2018
DOI: 10.1007/978-3-030-00937-3_42

Learn the New, Keep the Old: Extending Pretrained Models with New Anatomy and Images

Abstract: Deep learning has been widely accepted as a promising solution for medical image segmentation, given a sufficiently large, representative dataset of images with corresponding annotations. With ever-increasing amounts of annotated medical data, it is infeasible to always retrain a learning method from scratch on all data; doing so is also doomed to hit computational limits, e.g., the memory or runtime feasible for training. Incremental learning can be a potential solution, where new information (images or anatomy) i…
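The truncated abstract describes the general setting: extending an already trained segmentation network with new anatomy or images without retraining from scratch. As a rough illustration only, the sketch below shows one common incremental-learning pattern, fine-tuning a widened copy of the pretrained network while a frozen copy of the original supplies distillation targets for the old classes. All function and parameter names here are assumptions, and this is not necessarily the paper's exact method.

```python
import copy
import torch
import torch.nn.functional as F

def train_incremental(new_model, pretrained_model, new_loader,
                      num_old_classes, epochs=10, lr=1e-4, distill_weight=1.0):
    """Illustrative sketch, not the paper's method.
    new_model: same backbone as pretrained_model, output head widened for new anatomy.
    pretrained_model: the "old" network, frozen and used only for soft targets."""
    teacher = copy.deepcopy(pretrained_model).eval()
    for p in teacher.parameters():
        p.requires_grad_(False)

    opt = torch.optim.Adam(new_model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, labels in new_loader:          # labels: (B, H, W) class indices
            logits = new_model(images)             # (B, num_old + num_new, H, W)
            seg_loss = F.cross_entropy(logits, labels)

            with torch.no_grad():
                old_logits = teacher(images)       # (B, num_old, H, W)
            # "Keep the old": match the new model's old-class outputs to the teacher.
            distill_loss = F.kl_div(
                F.log_softmax(logits[:, :num_old_classes], dim=1),
                F.softmax(old_logits, dim=1),
                reduction="batchmean",
            )

            loss = seg_loss + distill_weight * distill_loss
            opt.zero_grad()
            loss.backward()
            opt.step()
    return new_model
```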

Cited by 47 publications (44 citation statements). References 10 publications.
“…consecutive slices from a 3D volume. In order to account for both of these conditions (i) and (ii), we propose to first pick a large batch of samples for which the trained model is "confident" [9], and then to prune them with the aim of maximizing how well they represent the entire dataset [4].…”
Section: Keeping Informative Samples
Mentioning, confidence: 99%
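The quoted passage describes a two-step exemplar selection: first take a large pool of samples on which the trained model is confident, then prune that pool so the kept samples represent the whole dataset. A minimal sketch of that idea follows, assuming per-sample confidence scores and feature vectors are already computed; the greedy farthest-point pruning is an illustrative choice, not necessarily the citing authors' exact criterion.

```python
import numpy as np

def select_exemplars(confidences, features, pool_size=200, keep_size=20):
    """Illustrative sketch of confident-then-representative sample selection."""
    # (i) take the pool_size samples the trained model is most confident on
    pool = np.argsort(confidences)[::-1][:pool_size]

    # (ii) greedily prune to keep_size samples that spread over the feature space
    # (farthest-point sampling on the pooled feature vectors)
    kept = [int(pool[0])]
    while len(kept) < keep_size:
        # distance of every pooled sample to its nearest already-kept sample
        d = np.min(
            np.linalg.norm(features[pool][:, None] - features[kept][None], axis=-1),
            axis=1,
        )
        kept.append(int(pool[np.argmax(d)]))
    return kept
```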
“…Therefore, for all incremental settings for a given hold-out experiment, the validation and test sets are exactly the same, allowing us to make direct comparison between IRs of that set. 13,32,56,89,27,43,70,16,41,97,10,73,12,48,86,29,94,6,67,66,36,17,50,35,8,96,28,20,82,26,63,14,25,4,18,39,9,79,7,65,37,90,57,100,55,44,51,68,47,69,62,98,80,42,59,49,99,58,76,33,95,…”
Section: E Dataset Evaluation
Mentioning, confidence: 99%
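The quoted evaluation note states that, for a given hold-out experiment, the validation and test sets are identical across all incremental settings, so results can be compared directly. Below is a small sketch of one way to pin such a split; the integer case IDs, split sizes, and fixed seed are all illustrative assumptions, not values from the paper.

```python
import random

def fixed_holdout_split(case_ids, n_val, n_test, seed=0):
    """Deterministic split: the same seed yields the same val/test IDs every run,
    so every incremental setting evaluates on identical hold-out cases."""
    rng = random.Random(seed)
    ids = sorted(case_ids)
    rng.shuffle(ids)
    return ids[:n_val], ids[n_val:n_val + n_test], ids[n_val + n_test:]

val_ids, test_ids, train_pool = fixed_holdout_split(range(1, 101), n_val=10, n_test=20)
```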