2021
DOI: 10.3390/iot2020012
ThriftyNets: Convolutional Neural Networks with Tiny Parameter Budget

Abstract: Deep Neural Networks are state-of-the-art in a large number of challenges in machine learning. However, to reach the best performance they require a huge pool of parameters. Indeed, typical deep convolutional architectures present an increasing number of feature maps as we go deeper in the network, whereas the spatial resolution of inputs is decreased through downsampling operations. This means that most of the parameters lie in the final layers, while a large portion of the computations are performed by a small f…

Cited by 1 publication (1 citation statement); references 34 publications.
“…Wu et al (2018) proposed a method of clustering parameters using the k-means algorithm and sharing weights through k clustering centers and a weight distribution index. Coiffier et al (2021) proposed the ThriftyNets network structure. ThriftyNets adopt a cyclically reused convolutional layer, which compresses the number of parameters and improves the efficiency of parameter utilization.…”
Section: Introduction
Confidence: 99%
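The citation statement above describes the core idea: instead of a distinct convolutional layer at every depth, a single layer is applied cyclically, so the parameter count no longer grows with depth. The sketch below is a hypothetical illustration of that accounting (it is not the authors' code; all function names are made up for this example), comparing the parameter budget of a conventional stack against a single shared 3x3 convolution reused over many iterations.

```python
# Hypothetical sketch of the parameter-sharing arithmetic behind a
# "thrifty" network: one conv layer reused T times vs. T distinct layers.
# Function names are illustrative, not from the ThriftyNets paper.

def conv_params(in_ch: int, out_ch: int, k: int = 3) -> int:
    """Weights plus biases of a single k x k convolution."""
    return in_ch * out_ch * k * k + out_ch

def stacked_params(channels: int, depth: int) -> int:
    """Conventional stack: a distinct conv at every depth, so the
    parameter count grows linearly with depth."""
    return depth * conv_params(channels, channels)

def thrifty_params(channels: int, depth: int) -> int:
    """Cyclically reused conv: the same weights are applied at every
    iteration, so the parameter count is independent of depth."""
    return conv_params(channels, channels)

if __name__ == "__main__":
    # With 128 channels and depth 30, the shared layer needs 30x fewer
    # parameters than the conventional stack.
    print(stacked_params(128, 30))   # 30 distinct layers
    print(thrifty_params(128, 30))   # one shared layer
```

Under these assumptions the compression factor equals the depth: reusing one layer 30 times divides the convolutional parameter budget by 30, which is the "efficiency of parameter utilization" the citing work refers to.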