2021
DOI: 10.1007/978-3-030-78191-0_1

HyperMorph: Amortized Hyperparameter Learning for Image Registration

Cited by 85 publications (45 citation statements)
References 29 publications
“…In this framework, the parameters of the hypernetwork, and not the main network, are learned. While originally introduced for achieving weight-sharing and model compression [25], this idea has found numerous applications including neural architecture search [26], [27], Bayesian neural networks [28], [29], multi-task learning [30]–[34], and hyperparameter optimization [35], [36].…”
Section: Hypernetwork
confidence: 99%
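The statement above describes the core hypernetwork setup: a secondary network maps an input (here, a hyperparameter) to the weights of the main network, and only the hypernetwork's parameters are learned. A minimal NumPy sketch of that idea follows; all dimensions and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): the main network is a
# single linear layer mapping 4 -> 3; the hypernetwork is a linear map
# from a scalar hyperparameter to that layer's 4*3 weights.
IN_DIM, OUT_DIM = 4, 3
N_MAIN = IN_DIM * OUT_DIM

# Only these hypernetwork parameters (H, b) would be learned by training.
H = rng.normal(scale=0.1, size=(N_MAIN, 1))
b = np.zeros(N_MAIN)

def main_net(x, lam):
    """Run the main network using weights generated from hyperparameter lam."""
    w = (H @ np.array([[lam]])).ravel() + b   # hypernetwork forward pass
    W = w.reshape(OUT_DIM, IN_DIM)            # reshape into main-network weights
    return W @ x                              # main-network forward pass

x = rng.normal(size=IN_DIM)
y1 = main_net(x, lam=0.1)
y2 = main_net(x, lam=0.9)
# Different hyperparameter values yield different effective main-network
# weights, without any separately stored main-network parameters.
```

At test time, sweeping `lam` amortizes the hyperparameter search: one trained hypernetwork covers a continuum of main-network configurations.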
“…al. use this method to enable rapid test-time tunability in image registration for medical imaging [36]. Our previous conference work applied this idea to the AO setting for CS-MRI [9].…”
Section: Related Work
confidence: 99%
“…We obtain the preprocessed dataset from [38]. The MRI scans were preprocessed [39] using FreeSurfer [40] with standard steps such as resampling, bias correction, skull stripping, affine normalization, and center cropping into volumes of 160 × 192 × 224. For our experiments, we split the dataset into training, validation, and test sets of sizes 255, 15, and 144, respectively.…”
Section: Data
confidence: 99%
“…Parallel to our work, Hoopes et al. [13] propose to learn the effects of registration hyperparameters on the deformation field with hypernetworks [10], which leverage a secondary network to generate the conditioned weights for the entire network's layers. While the hypernetwork-based method offers immense modulation potential, it adds an enormous number of parameters to the original image registration method.…”
Section: Introduction
confidence: 99%
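The parameter-count concern in the statement above follows from the architecture itself: a hypernetwork's output layer must emit every weight of the main network, so its own parameter count scales with the main network's size times the hypernetwork's last hidden width. A back-of-the-envelope sketch, using hypothetical sizes not drawn from either paper:

```python
# Hypothetical sizes: a main registration network with 1e6 weights, and a
# hypernetwork with a 64-unit hidden layer whose output layer must emit
# all of those weights.
P_MAIN = 1_000_000
HIDDEN = 64

# hyperparameter (1) -> hidden (64): weights + biases
# hidden (64) -> all main weights (P_MAIN): weights + biases
p_hyper = (1 * HIDDEN + HIDDEN) + (HIDDEN * P_MAIN + P_MAIN)
ratio = p_hyper / P_MAIN
print(ratio)  # ~65x the main network's own parameter count
```

Even a shallow hypernetwork thus multiplies storage roughly by its last hidden width, which is the overhead the citing authors contrast against their own conditioning approach.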