2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr52688.2022.00813

It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher

Cited by 25 publications (14 citation statements)
References 28 publications
“…To verify the superiority of AdaSG, we compare it with typical DFQ approaches, including GDFQ (Xu et al. 2020), ARC (Zhu et al. 2021), and Qimera (Choi et al. 2021), which reconstruct the original data from P; ZAQ (Liu, Zhang, and Wang 2021), which focuses primarily on adversarial sample generation rather than the adversarial game process of AdaSG; IntraQ (Zhong et al. 2022), which optimizes the noise to obtain fake samples without a generator; and AIT (Choi et al. 2022), which improves the loss function and manipulates the gradients for ARC to generate better samples, denoted as ARC+AIT.…”
Section: Comparison With State-of-the-arts (mentioning)
confidence: 99%
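The excerpt above groups GDFQ, ARC, and Qimera as generator-based methods that reconstruct training-like data from the pretrained network P. A minimal sketch of that shared pattern follows, assuming a conditional generator that maps noise plus a target label to a fake image; the names (FakeDataGenerator, teacher) and the architecture are hypothetical placeholders, not the published code, and real methods add further regularizers such as batch-norm statistics matching.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FakeDataGenerator(nn.Module):
    """Hypothetical conditional generator in the GDFQ/ARC/Qimera mold."""
    def __init__(self, noise_dim=100, num_classes=10, img_shape=(3, 32, 32)):
        super().__init__()
        self.embed = nn.Embedding(num_classes, noise_dim)
        out_dim = img_shape[0] * img_shape[1] * img_shape[2]
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 512), nn.ReLU(),
            nn.Linear(512, out_dim), nn.Tanh(),
        )
        self.img_shape = img_shape

    def forward(self, z, labels):
        # Condition the noise on the target class embedding.
        h = z * self.embed(labels)
        return self.net(h).view(-1, *self.img_shape)

def generator_step(generator, teacher, batch_size=64, noise_dim=100, num_classes=10):
    """One generator update: the teacher should classify each fake
    sample as its conditioning label."""
    z = torch.randn(batch_size, noise_dim)
    labels = torch.randint(0, num_classes, (batch_size,))
    fake = generator(z, labels)
    logits = teacher(fake)  # full-precision teacher supervises the generator
    return F.cross_entropy(logits, labels)
```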
“…Inspired by BSS [34], Qimera enabled the generator to synthesize boundary-supporting samples through superposed latent embeddings, which improved the quantized model's ability to identify class boundaries. In contrast, AIT [9] paid no attention to the quality of synthetic samples but improved the training strategy of the quantized model.…”
Section: Related Work (mentioning)
confidence: 99%
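The "superposed latent embeddings" idea described in this excerpt can be sketched as a convex blend of two class embeddings: samples generated from the mixture are intended to fall near the decision boundary between the two classes. This sketch reuses the hypothetical FakeDataGenerator interface from above and is an assumption-laden illustration, not Qimera's published implementation.

```python
import torch

def superposed_sample(generator, z, label_a, label_b, alpha=0.5):
    """Generate samples from a blend of two class embeddings so they
    lie near the a/b class boundary (Qimera-style, simplified)."""
    # Convex combination of the two class embeddings.
    e = alpha * generator.embed(label_a) + (1.0 - alpha) * generator.embed(label_b)
    # Feed the superposed embedding through the generator body.
    return generator.net(z * e).view(-1, *generator.img_shape)
```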
“…Given the above considerations, data-free quantization has attracted much attention from researchers [8][9][10]. This technique can quantize a network precisely without accessing any authentic data.…”
Section: Introduction (mentioning)
confidence: 99%
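To make concrete why quantization is possible "without accessing any authentic data": the weights themselves carry enough information for per-tensor affine quantization; it is the activation ranges that need (fake) calibration data, which is what the generator-based methods above supply. A minimal weight-quantization sketch, with a hypothetical helper name:

```python
import torch

def quantize_weights(w: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    """Per-tensor affine fake-quantization of a weight tensor.
    Needs only the weights, no data."""
    qmax = 2 ** num_bits - 1
    # Guard against a constant tensor, where max == min.
    scale = ((w.max() - w.min()) / qmax).clamp(min=1e-8)
    zero_point = torch.round(-w.min() / scale)
    q = torch.clamp(torch.round(w / scale + zero_point), 0, qmax)
    return (q - zero_point) * scale  # dequantized ("fake-quantized") weights
```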
“…The best results are reported in boldface. … and Qimera [3] reconstruct the original data from P; ZAQ [11] focuses primarily on adversarial sample generation rather than the adversarial game process of AdaDFQ; IntraQ [24] optimizes the noise to obtain fake samples without a generator; AIT [4] improves the loss function and gradients for ARC to generate better samples, denoted as ARC+AIT; AdaSG [17] focuses on the zero-sum game framework, serving as a special case of AdaDFQ. Table 2 summarizes the following findings: 1) AdaDFQ obtains a significant and consistent accuracy gain over the state-of-the-arts, in line with our purpose of optimizing the margin to generate samples with adaptive adaptability to Q (Sec. 2.3).…”
Section: Comparison With State-of-the-arts (mentioning)
confidence: 99%
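The zero-sum game this excerpt attributes to AdaSG/AdaDFQ can be sketched as adversarial disagreement between the full-precision teacher and the quantized model Q: the generator maximizes their disagreement on synthesized samples, while Q minimizes it. The sketch below is a rough, assumption-laden rendering of that objective, not the published code; the margin-based adaptation AdaDFQ adds on top is omitted.

```python
import torch
import torch.nn.functional as F

def disagreement(teacher_logits, student_logits):
    """KL divergence between teacher and quantized-student predictions."""
    log_p_t = F.log_softmax(teacher_logits, dim=1)
    log_p_s = F.log_softmax(student_logits, dim=1)
    # KL(p_teacher || p_student), averaged over the batch.
    return F.kl_div(log_p_s, log_p_t, log_target=True, reduction="batchmean")

def game_losses(generator, teacher, student, z, labels):
    """One step of the zero-sum game: opposite signs on the same quantity."""
    fake = generator(z, labels)
    d = disagreement(teacher(fake), student(fake))
    gen_loss = -d      # generator: maximize teacher/student disagreement
    student_loss = d   # quantized model Q: minimize the same disagreement
    return gen_loss, student_loss
```

In practice the two losses would be applied in alternating optimizer steps, detaching whichever network is held fixed in each phase.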