2021
DOI: 10.48550/arxiv.2104.08215
Preprint

"BNN - BN = ?": Training Binary Neural Networks without Batch Normalization

Abstract: Batch normalization (BN) is a key facilitator of, and considered essential for, state-of-the-art binary neural networks (BNNs). However, the BN layer is costly to compute and is typically implemented with non-binary parameters, leaving a hurdle for the efficient implementation of BNN training. It also introduces undesirable dependence between samples within each batch. Inspired by recent advances in Batch Normalization Free (BN-Free) training [7], we extend that framework to training BNNs, and for the first ti…
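The abstract's two objections to BN — float-valued (non-binary) parameters and cross-sample dependence within a batch — can be illustrated with a minimal NumPy sketch. This is an illustration of standard batch normalization, not code from the paper; the function name and values are hypothetical.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization over the batch axis (axis 0).

    The mean and variance are computed across the whole batch, so each
    sample's output depends on every other sample in the batch -- the
    undesirable dependence the abstract points out. In a real BN layer,
    gamma, beta, and the running statistics are float-valued, i.e. the
    non-binary parameters that complicate efficient BNN training.
    """
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mu) / np.sqrt(var + eps) + beta

# Pre-activations for a batch of 4 samples, 3 features each (made-up values).
x = np.array([[1.0, 2.0, 3.0],
              [2.0, 0.0, 1.0],
              [0.0, 1.0, 2.0],
              [3.0, 3.0, 0.0]])
y = batch_norm(x)

# Perturbing one sample shifts the batch statistics, so the normalized
# outputs of all the *other* samples change as well.
x2 = x.copy()
x2[0] += 10.0
y2 = batch_norm(x2)
print(np.allclose(y[1:], y2[1:]))  # False: untouched rows shifted too
```

BN-Free training removes this layer entirely, which also removes the batch-level coupling shown above.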

Cited by 0 publications. References 64 publications (153 reference statements).
