2021
DOI: 10.1007/s40747-021-00308-x
Heuristic rank selection with progressively searching tensor ring network

Abstract: Recently, tensor ring networks (TRNs) have been applied in deep networks, achieving remarkable successes in compression ratio and accuracy. Although highly related to the performance of TRNs, rank selection is seldom studied in previous works and is usually set equal across modes in experiments. Meanwhile, there is no heuristic method to choose the rank, and enumerating to find an appropriate rank is extremely time-consuming. Interestingly, we discover that some of the rank elements are sensitive and usually aggreg…

Cited by 24 publications (32 citation statements). References 19 publications.
“…Tensorized decomposition methods substitute a high-dimensional tensor with the product of multiple low-dimensional tensors. These are Tucker decomposition (TD), canonical polyadic decomposition (CPD), tensor train (TT), and tensor ring (TR) in the tensor decomposition [25, 26, 27, 28]. In tensor decomposition, the lower the rank of decomposed tensors, the lower the amount of computation, but the lower the accuracy of the network.…”
Section: Related Research
confidence: 99%
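The rank-versus-size tradeoff quoted above can be made concrete with a small parameter count. The sketch below compares a dense weight tensor with its tensor-ring factorization, in which core k has shape (r_k, n_k, r_{k+1}) and the ring is closed by wrapping the last rank back to the first. The tensor dimensions and the equal-rank setting are hypothetical, chosen only to illustrate how rank drives the compression ratio:

```python
# Minimal sketch (illustrative sizes, not from the paper): parameter count
# of a tensor-ring (TR) factorization versus the dense tensor it replaces.
from math import prod


def dense_params(dims):
    """Number of parameters in the original dense tensor."""
    return prod(dims)


def tr_params(dims, ranks):
    """Parameters of a TR factorization: core k has shape
    (ranks[k], dims[k], ranks[k+1]), with ranks wrapping around
    so the last core connects back to the first (the 'ring')."""
    d = len(dims)
    assert len(ranks) == d, "one rank per mode in a tensor ring"
    return sum(ranks[k] * dims[k] * ranks[(k + 1) % d] for k in range(d))


# A 4-way tensor, e.g. a reshaped 16x16x(3*3)x64 convolution kernel.
dims = [16, 16, 3 * 3, 64]
equal_ranks = [8, 8, 8, 8]  # the "equal rank everywhere" setting the paper critiques

print(dense_params(dims))                                 # 147456
print(tr_params(dims, equal_ranks))                       # 6720
print(dense_params(dims) / tr_params(dims, equal_ranks))  # compression ratio
```

Halving every rank roughly quarters the core sizes (each core is linear in two ranks), which is why rank selection dominates both the compression ratio and, per the quoted statement, the accuracy.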
“…The previous study [18] uses two TTD methods in the experiments, but the results are very close without any in-depth analysis. Even tensor-ring, a Train-TTD-like SOTA method, reports that it suffers from a 0.88% accuracy drop with a 2.2× compression ratio for ResNet32 on Cifar-10 [29]. Moreover, accuracy degradation can be worse on deep MLPs because their structures and blocks essentially differ from RNNs and CNNs.…”
Section: B. Tensor-Train for Deep Neural Network
confidence: 99%
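For comparison with the tensor-ring case discussed above, tensor-train (TT) cores form an open chain rather than a ring: the boundary ranks are fixed to 1. A short sketch of the TT parameter count, with hypothetical mode sizes and ranks (not taken from the cited experiments):

```python
# Minimal sketch (illustrative sizes): parameter count of a tensor-train (TT)
# factorization. Unlike a tensor ring, TT fixes the boundary ranks to 1,
# so the cores form an open chain.
from math import prod


def tt_params(dims, ranks):
    """Parameters of a TT factorization: core k has shape
    (ranks[k], dims[k], ranks[k+1]), where len(ranks) == len(dims) + 1
    and the boundary ranks satisfy ranks[0] == ranks[-1] == 1."""
    assert len(ranks) == len(dims) + 1, "TT needs d+1 ranks for d modes"
    assert ranks[0] == ranks[-1] == 1, "TT boundary ranks are 1"
    return sum(ranks[k] * dims[k] * ranks[k + 1] for k in range(len(dims)))


# A 4-way tensor with 65536 dense entries (e.g. a reshaped MLP weight).
dims = [16, 16, 16, 16]
ranks = [1, 8, 8, 8, 1]

print(prod(dims))                     # 65536 dense parameters
print(tt_params(dims, ranks))         # 2304 TT parameters
print(prod(dims) / tt_params(dims, ranks))  # compression ratio
```

Because the interior ranks appear quadratically in the middle cores, the accuracy-versus-compression tradeoff the quoted statement reports (e.g. a 2.2× ratio at a 0.88% accuracy drop) is governed almost entirely by how those interior ranks are chosen.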
“…Classic DNNs can be compressed using techniques from five categories [1]: pruning [2], quantization [3], knowledge distillation [4], low-rank decomposition [5], and network architecture search [6]. Low-rank decomposition extracts low-rank features from the original tensor via tensor decomposition.…”
Section: Introduction
confidence: 99%