2021
DOI: 10.1016/j.neunet.2021.09.012

ACSL: Adaptive correlation-driven sparsity learning for deep neural network compression

Cited by 6 publications (3 citation statements)
References 34 publications
“…Neural networks, as we know, can approximate any function, which can accomplish a predetermined goal. Consequently, various types of neural networks have been applied in a large number of fields (see earlier studies [1][2][3][4][5][6][7][8]). The application of an integrated circuit chip with a neuron unit to the field of artificial intelligence is a good example.…”
Section: Introduction (mentioning)
confidence: 99%
“…Chapter 5 has been submitted to a journal as [2]: Wei He, Meiqing Wu, Siew-Kei Lam. The explicit contributions of the co-authors are listed below:…”
Section: Authorship Attribution Statement (mentioning)
confidence: 99%
“…The work described in this chapter has been submitted for publication as [2]. Algorithm 1: ACSL for channel pruning.…”
Section: Chapter 5 Adaptive Correlation-driven Sparsity Learning For ... (mentioning)
confidence: 99%