Proceedings of the ACM International Conference on Supercomputing 2021
DOI: 10.1145/3447818.3459988
ClickTrain

Abstract: Convolutional neural networks (CNNs) are becoming increasingly deeper, wider, and non-linear because of the growing demand for prediction accuracy and analysis quality. Wide and deep CNNs, however, require large amounts of computing resources and processing time. Many previous works have studied model pruning to improve inference performance, but little work has been done on effectively reducing training cost. In this paper, we propose ClickTrain: an efficient and accurate end-to-end training and pruning…
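The abstract contrasts pruning for inference with pruning during training. As a rough illustration of the general prune-while-training idea (this is only a generic magnitude-pruning sketch, not ClickTrain's actual algorithm, whose details are truncated above):

```python
# Generic sketch: alternate training updates with magnitude-based pruning,
# gradually raising sparsity. Illustrative only; NOT ClickTrain's method.

def prune_smallest(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

def train_step(weights, grads, lr=0.1):
    """One SGD update (stand-in for a real training step)."""
    return [w - lr * g for w, g in zip(weights, grads)]

# Toy loop over a flat weight vector with a rising sparsity schedule.
weights = [0.5, -0.02, 0.3, 0.01, -0.4, 0.003]
for sparsity in [0.2, 0.4, 0.5]:
    grads = [0.1] * len(weights)          # dummy gradients
    weights = train_step(weights, grads)
    weights = prune_smallest(weights, sparsity)

print(sum(1 for w in weights if w == 0.0))  # → 3 pruned weights
```

The point of pruning during training, as opposed to after it, is that the remaining training steps can compensate for the removed weights while also running faster on the sparser model.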

Cited by 12 publications (1 citation statement)
References 46 publications
“…RigL [90], ITOP [91], SET [104], DSR [89], and MEST [86], is provided in Tab. The three main sparsity schemes introduced in the area of network pruning consist of unstructured [105-107], structured [3, 45, 108-119], and fine-grained structured pruning [120-129].…”
Section: Discussion
confidence: 99%
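The citation statement names three sparsity schemes: unstructured, structured, and fine-grained structured pruning. A minimal sketch of the difference, on a toy 4×4 weight matrix (the thresholds and the 2:4 grouping below are illustrative assumptions, not values from any cited paper):

```python
# Three sparsity schemes on a toy 4x4 weight matrix (illustrative only).

W = [[0.9, -0.1, 0.05, 0.8],
     [0.02, 0.7, -0.6, 0.01],
     [0.03, -0.04, 0.02, 0.05],
     [0.5, 0.6, -0.7, 0.9]]

# Unstructured: zero individual weights below a magnitude threshold.
unstructured = [[0.0 if abs(w) < 0.1 else w for w in row] for row in W]

# Structured: remove whole rows (e.g. filters/channels) with small L1 norm.
row_norms = [sum(abs(w) for w in row) for row in W]
structured = [row if norm >= 0.5 else [0.0] * len(row)
              for row, norm in zip(W, row_norms)]

# Fine-grained structured (e.g. 2:4 sparsity): keep the 2 largest-magnitude
# weights in every group of 4 (here, each row is one group).
def keep_top2(row):
    keep = sorted(range(len(row)), key=lambda i: -abs(row[i]))[:2]
    return [w if i in keep else 0.0 for i, w in enumerate(row)]

fine_grained = [keep_top2(row) for row in W]
```

Unstructured pruning gives the most flexibility but irregular memory access; structured pruning removes whole rows/filters and maps directly to dense hardware; fine-grained structured (N:M) sparsity is a middle ground that some accelerators support natively.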