2021
DOI: 10.1109/lcomm.2021.3102319
Realizing Neural Decoder at the Edge With Ensembled BNN

Cited by 3 publications (1 citation statement)
References 13 publications
“…Before that, let us first discuss the number of parameters and FLOPs in a single convolution layer. The memory and FLOPs for a 1D-CNN and a 2D-CNN were given in [27] and [21], respectively. Considering L layers in a ResNet architecture, the trainable parameters are given by φ = {W_1, .…”
Section: Saving in Memory and Computation Using RReLU
confidence: 99%
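The citing passage counts parameters and FLOPs per convolution layer. The excerpt does not reproduce the formulas from [27] and [21], so the sketch below uses the standard textbook counts for a 2-D convolution (an assumption, not taken from the paper): parameters = C_out · (C_in · K_h · K_w + 1) with bias, and FLOPs = 2 · C_in · K_h · K_w · C_out · H_out · W_out, counting each multiply-accumulate as two FLOPs.

```python
# Hedged sketch: standard parameter and FLOP counts for one 2-D conv layer.
# These conventions are assumed; the cited works [27] and [21] may count
# a MAC as one FLOP or omit the bias term.

def conv2d_params(c_in: int, c_out: int, k_h: int, k_w: int, bias: bool = True) -> int:
    """Trainable parameters: one K_h x K_w x C_in kernel per output channel,
    plus an optional bias per output channel."""
    return c_out * (c_in * k_h * k_w + (1 if bias else 0))

def conv2d_flops(c_in: int, c_out: int, k_h: int, k_w: int,
                 h_out: int, w_out: int) -> int:
    """FLOPs for one forward pass, counting a multiply-accumulate as 2 FLOPs:
    every output position does C_in*K_h*K_w MACs per output channel."""
    return 2 * c_in * k_h * k_w * c_out * h_out * w_out

# Example: 3 -> 64 channels, 3x3 kernel, 32x32 output feature map
print(conv2d_params(3, 64, 3, 3))         # 1792
print(conv2d_flops(3, 64, 3, 3, 32, 32))  # 3538944
```

Summing `conv2d_params` over the L layers of a ResNet gives the size of the trainable set φ = {W_1, …, W_L} that the quoted passage refers to.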