2020
DOI: 10.48550/arxiv.2004.02867
Preprint

Rethinking Spatially-Adaptive Normalization

Zhentao Tan,
Dongdong Chen,
Qi Chu
et al.
Cited by 6 publications (7 citation statements)
References 27 publications (69 reference statements)
“…Normalization layers that do not track running statistics introduce extra computation cost, and we take this into account in our calculation of MACs during pruning. Moreover, for GauGAN, we use synchronized batch normalization as suggested by previous work [58,67], and remove the spectral norm [55] as we find it does not have much impact on model performance.…”
Section: Acknowledgements
confidence: 99%
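The statement above hinges on a compute detail worth spelling out: when a normalization layer tracks running statistics, its inference-time transform folds into a fixed per-element affine, whereas without tracked statistics the batch mean and variance must be recomputed on every forward pass, adding extra passes over the activations that count toward MACs. A minimal sketch of the two cases, with made-up parameter values (not code from the cited work):

```python
# Sketch: inference cost of normalization with vs. without tracked
# running statistics. With tracked stats, the whole transform
# (x - mean) / std * gamma + beta collapses into one fixed
# multiply-add per element; without them, mean and variance must be
# recomputed from the activations each forward pass (extra MACs).

def normalize_with_tracked_stats(x, running_mean, running_var,
                                 gamma, beta, eps=1e-5):
    # scale/shift are constants at inference: one multiply-add per element
    scale = gamma / (running_var + eps) ** 0.5
    shift = beta - running_mean * scale
    return [v * scale + shift for v in x]

def normalize_without_tracked_stats(x, gamma, beta, eps=1e-5):
    # two extra passes over the data to get batch statistics
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    scale = gamma / (var + eps) ** 0.5
    return [(v - mean) * scale + beta for v in x]
```

When the batch statistics happen to equal the stored running statistics, both paths produce the same output; the difference is purely in the computation performed per forward pass.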
“…We compare INADE with several SOTA works, including quality-oriented (pix2pixHD [42], SPADE [30], CLADE [38,37] and SEAN [51]) and diversity-oriented (BicycleGAN [50], DSCGAN [44], and GroupDNet [52]) methods. For a fair comparison, we directly use the pretrained models provided by the authors when available, otherwise train the models by ourselves using the codes and settings provided by the authors.…”
Section: Quantitative and Qualitative Comparisons
confidence: 99%
“…We adopt this step to facilitate supervised training and enable test-time reference-based style guidance. Inspired by [30,38,37], the proposed method is called INADE (INstance-Adaptive DEnormalization).…”
Section: Introduction
confidence: 99%
“…Other choices have also been proposed to remedy the issue. For instance, CLADE [66] proposed to use class-adaptive modulation parameters instead. CC-FPSE [47] employed spatially-varying convolutional weights instead of the spatially-varying normalization layers.…”
Section: Related Work
confidence: 99%
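The contrast the statement draws is that CLADE-style modulation needs only one learned (gamma, beta) pair per semantic class, looked up per pixel from the label map, rather than predicting full spatial gamma/beta maps with convolutions as SPADE does. A minimal illustrative sketch, with hypothetical class parameters (not the authors' implementation):

```python
# Sketch of class-adaptive modulation in the spirit of CLADE: each
# semantic class owns one learned (gamma, beta) scalar pair; the
# segmentation label map indexes these per pixel, so no convolution
# over the mask is needed to produce the modulation parameters.
# All parameter values below are made-up placeholders.

def class_adaptive_modulation(normalized, label_map, class_gamma, class_beta):
    """Pixel-wise modulation of a normalized H x W feature map.

    normalized: H x W list of floats (output of a parameter-free norm)
    label_map:  H x W list of integer class ids
    class_gamma / class_beta: dicts mapping class id -> scalar
    """
    out = []
    for feat_row, label_row in zip(normalized, label_map):
        out.append([class_gamma[c] * v + class_beta[c]
                    for v, c in zip(feat_row, label_row)])
    return out
```

Because the lookup broadcasts one pair of scalars over every pixel of a class, the parameter count scales with the number of classes rather than with spatial resolution, which is the efficiency argument made for class-adaptive schemes.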