2021 IEEE Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv48630.2021.00148
MeliusNet: An Improved Network Architecture for Binary Neural Networks

Cited by 31 publications (12 citation statements) · References 20 publications
“…We study the performance of DIR-Net over ResNet-18, ResNet-34, MobileNetV1, DARTS, and EfficientNet-B0 structures on the large-scale ImageNet dataset. Table 6 lists the comparison with several SOTA quantization methods, including BWN [68], HWGQ [50], TWN [48], LQ-Net [87], DoReFa-Net [90], ABC-Net [51], Bi-Real [57], XNOR++ [6], BWHN [36], SQ-BWN and SQ-TWN [19], PCNN [26], BONN [27], Si-BNN [76], Real-to-Bin [59], MeliusNet [5], and ReActNet [56].…”
Section: ImageNet
confidence: 99%
“…6(b)). Fourth, we compare the proposed DIR-Net with more SOTA binarization approaches in Table 6 (BONN [27], Si-BNN [76], PCNN [26], Real-to-Bin [59], MeliusNet [5], and ReActNet [56]) and evaluate it on compact networks (EfficientNet [73], MobileNet [34], and DARTS [54]). The results show that our DIR-Net is versatile and effective, and can improve the performance of these structures.…”
Section: Introduction
confidence: 99%
“…Since the proposal of BNNs [1], [2], significant effort has been committed to closing the accuracy gap between BNNs and full-precision DNNs. These efforts include adding learnable scaling factors (also known as gain terms) [33]-[36], adopting multiple bases [33]-[35], [37], designing BNN-oriented network structures [16], [18], [23], [38], and reforming BNN training methodologies [22], [23], [36], [39]-[41]. More detailed information can be found in the BNN surveys [42], [43].…”
Section: A Binary Neural Network
confidence: 99%
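The excerpt above mentions learnable scaling factors (gain terms) as one line of work for narrowing the BNN accuracy gap. A minimal sketch of the idea, in the XNOR-Net style where a real-valued weight tensor is approximated as a scalar times its sign pattern (function name `binarize_with_scale` is illustrative, not from the cited papers):

```python
import numpy as np

def binarize_with_scale(w):
    """Approximate W by alpha * sign(W).

    For the L2-optimal rank-one approximation, the analytic optimum for
    the scaling factor alpha is the mean absolute value of the weights.
    sign(0) is mapped to +1, as is conventional in BNNs.
    """
    alpha = np.abs(w).mean()
    w_bin = np.where(w >= 0, 1.0, -1.0)
    return alpha * w_bin, alpha
```

Making `alpha` a trainable parameter instead of this closed-form mean is the "learnable" variant the survey refers to; the binary sign pattern still admits XNOR/popcount arithmetic, since the scalar can be folded in after the bitwise dot product.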
“…Though BNNs significantly alleviate the computation cost, they suffer from non-trivial performance degradation largely due to the imperfection of the binarization function and its derivative approximation [15], [16]. Most existing BNN works binarize the inputs and weights into binary (+1/-1) tensors through a deterministic sign function.…”
Section: Introduction
confidence: 99%
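The excerpt above attributes BNN accuracy loss to the deterministic sign binarization and the approximation used for its derivative. A minimal sketch of both pieces, assuming the common clipped-identity straight-through estimator (STE); the function names are illustrative:

```python
import numpy as np

def sign_binarize(x):
    # Deterministic sign function: every entry becomes +1 or -1
    # (zero is conventionally mapped to +1).
    return np.where(x >= 0, 1.0, -1.0)

def ste_grad(x, upstream_grad):
    # The true derivative of sign() is zero almost everywhere, so
    # backpropagation approximates it: pass the upstream gradient
    # through unchanged where |x| <= 1 and zero it elsewhere
    # (the clipped-identity straight-through estimator).
    return upstream_grad * (np.abs(x) <= 1.0)
```

The mismatch between the forward sign function and this surrogate backward pass is exactly the "imperfection of the binarization function and its derivative approximation" the excerpt describes.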