2022 IEEE 40th International Conference on Computer Design (ICCD) 2022
DOI: 10.1109/iccd56317.2022.00072
LightNorm: Area and Energy-Efficient Batch Normalization Hardware for On-Device DNN Training

Cited by 6 publications (3 citation statements)
References 7 publications
“…BN layers are used to resolve issues such as vanishing and exploding gradients during network training. However, during forward propagation, an additional intermediate variable $\tilde{Z}$ is generated, and the model's efficiency is reduced by the need to compute Equations (11)–(13) in sequence [25].…”
Section: Sparse Methods For Deep Learning Models
confidence: 99%
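The sequential dependence the quote describes (a batch mean, then a variance, then a normalized intermediate $\tilde{Z}$) is the standard batch-normalization forward pass. Equations (11)–(13) of the citing paper are not reproduced in the excerpt, so the sketch below assumes the conventional formulation; the function name `batchnorm_forward` is illustrative, not from the source.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Generic BN forward pass; the three steps must run in sequence,
    which is the efficiency bottleneck the quoted passage refers to."""
    mu = x.mean(axis=0)                     # (1) per-feature batch mean
    var = x.var(axis=0)                     # (2) per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # (3) normalized intermediate (the Z~ above)
    return gamma * x_hat + beta             # learned scale and shift

x = np.random.randn(8, 4)
y = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
```

Because step (3) cannot start before (1) and (2) finish, hardware such as LightNorm targets exactly this chain when reducing BN complexity.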
“…Unlike the previous two works discussed, ref. [13] proposes an accelerator that focuses on BN. This work has two contributions; the first is a reduced-complexity BN layer named LightNorm.…”
Section: Introduction
confidence: 99%
“…To evaluate these, a reference point is necessary. In addition to pre-existing work from [6, 11, 12, 13, 14, 15] used as a basis, training is performed with the aid of TensorFlow [16] and a BNN extension, Larq [4]. Network weights and parameters are then exported for further processing using the methodology elaborated in the subsequent section.…”
Section: Introduction
confidence: 99%