2022
DOI: 10.1007/978-3-031-20083-0_23
AdaBin: Improving Binary Neural Networks with Adaptive Binary Sets

Cited by 33 publications (12 citation statements)
References 25 publications
“…Competitive quantisation works endeavour to approximate model data to fewer than 4 bits. Binary neural networks perform extreme quantisation to single-bit values but suffer considerable accuracy loss in both convolution [21,27] and Transformer models [25]. Conversely, low-bit QNNs maintain high accuracy by two main techniques.…”
Section: Neural Network Quantisation
confidence: 99%
“…RB-Net [92] presented a reshaped point-wise convolution (RPC) and balanced distribution activation (BA) for more powerful representational ability. AdaBin [93] provides an adaptive binary set for the weights and activations of each layer, aligning the center position and distance of the binary-value distribution with the real-valued distribution. INSTA-BNN [94] determines the activation threshold from the difference between statistics computed over a batch and over each instance, improving accuracy.…”
Section: C: Gradient Error Minimization
confidence: 99%
“…In addition, several works modify the network structure to improve accuracy, such as ReActNet-C [87], INSTA-BNN [94], [93], BCDNet-A [95], and [101]. Their accuracy is within 1% of the full-precision MobileNet-v2.…”
Section: Image Classification
confidence: 99%
“…In reference [46], the AdaBin method was proposed to adaptively obtain the optimal binary set {b1, b2} for the weights and activations of each layer. The method defines a new binary quantization function using the center position and distance of the 1-bit values.…”
Section: The Key Technologies of Embedded AI
confidence: 99%
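The adaptive binary set the statements above describe can be sketched in a few lines. This is a minimal illustration, not AdaBin's exact closed-form solution: it assumes the center β is the mean of the real-valued weights and the distance α their standard deviation, so the two binary values {β − α, β + α} match the center and spread of the real-valued distribution.

```python
import numpy as np

def adaptive_binarize(w: np.ndarray) -> np.ndarray:
    """Map real-valued weights onto an adaptive binary set {b1, b2}.

    Sketch under assumed statistics: beta (center) is the mean of w and
    alpha (distance) its standard deviation, so b1 = beta - alpha and
    b2 = beta + alpha follow the real-valued distribution's position
    and spread rather than the fixed set {-1, +1}.
    """
    beta = w.mean()
    alpha = np.sqrt(((w - beta) ** 2).mean())  # population std
    # Each weight snaps to the nearer of the two binary values.
    return np.where(w >= beta, beta + alpha, beta - alpha)

# Example layer weights: the binarized tensor contains only two values.
w = np.array([-0.9, -0.2, 0.1, 0.8])
wb = adaptive_binarize(w)
```

In a per-layer scheme, β and α would be computed independently for each layer's weight tensor (and analogously for activations), which is what lets the binary set adapt to differing distributions across the network.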