2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
DOI: 10.1109/cvprw.2017.243
Tensor Contraction Layers for Parsimonious Deep Nets

Abstract: Tensors offer a natural representation for many kinds of data frequently encountered in machine learning. Images, for example, are naturally represented as third-order tensors, whose modes correspond to height, width, and channels. Tensor methods are noted for their ability to discover multi-dimensional dependencies, and tensor decompositions in particular have been used to produce compact low-rank approximations of data. In this paper, we explore the use of tensor contractions as neural network layers an…

Cited by 40 publications (25 citation statements); references 7 publications.
“…In fact, HOSVD decomposition is a multidimensional extension of PCA. Some results can be seen in recent papers, such as Hu [64].…”
Section: F. Various Types of Decomposition Applications
confidence: 83%
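The quoted statement describes HOSVD as a multidimensional extension of PCA: a truncated SVD is applied to each mode unfolding of the tensor, just as PCA truncates the SVD of a data matrix. Below is a minimal numpy-only sketch of this idea; the function names `unfold` and `hosvd` are illustrative, not from any paper or library discussed here.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor, ranks):
    """Truncated HOSVD: one SVD per mode unfolding, like PCA along each axis."""
    factors = []
    for mode, rank in enumerate(ranks):
        # Leading left singular vectors of the mode-n unfolding
        U, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(U[:, :rank])
    # Project the tensor onto each factor to obtain the (small) core tensor
    core = tensor
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, core, axes=(1, mode)), 0, mode)
    return core, factors

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 5, 4))
core, factors = hosvd(X, (3, 3, 3))
print(core.shape)  # (3, 3, 3)
```

As with PCA, keeping only the leading singular vectors per mode yields a compact low-rank approximation of the original tensor.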
“…A common challenge of the above techniques is determining the tensor rank. Exactly determining a tensor rank is in general NP-hard [47]. Therefore, in practice one often leverages numerical optimization or statistical techniques to obtain a reasonable rank estimation.…”
Section: Compact Deep Learning Models
confidence: 99%
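Since exact rank determination is NP-hard, a common statistical heuristic is to pick, per mode, the smallest number of singular values whose squared magnitudes retain a chosen fraction of the unfolding's energy. The sketch below illustrates this on a synthetic tensor with a planted low Tucker rank; `estimate_mode_ranks` and the 0.999 energy threshold are illustrative assumptions, not a method from the cited works.

```python
import numpy as np

def estimate_mode_ranks(tensor, energy=0.999):
    """Heuristic Tucker-rank estimate: per mode, the smallest k whose
    leading singular values keep `energy` of the squared spectral mass."""
    ranks = []
    for mode in range(tensor.ndim):
        mat = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        s = np.linalg.svd(mat, compute_uv=False)
        cum = np.cumsum(s**2) / np.sum(s**2)
        ranks.append(int(np.searchsorted(cum, energy)) + 1)
    return ranks

rng = np.random.default_rng(0)
# Plant a known low Tucker rank (2, 2, 2), then add small noise
core = rng.standard_normal((2, 2, 2))
A, B, C = (rng.standard_normal((d, 2)) for d in (8, 7, 6))
X = np.einsum('abc,ia,jb,kc->ijk', core, A, B, C)
X = X + 0.01 * rng.standard_normal(X.shape)
print(estimate_mode_ranks(X))
```

With low noise the estimate recovers small ranks close to the planted structure; in practice the energy threshold trades compression against reconstruction error.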
“…We can also preserve the multi-linear structure in the activation tensor using tensor contraction (Kossaifi et al., 2017), or by removing fully connected layers and flattening layers altogether and replacing them with tensor regression layers (Kossaifi et al., 2018). Adding a stochastic regularization on the rank of the decomposition can also help make the models more robust (Kolbeinsson et al., 2019).…”
Section: Other Models
confidence: 99%
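The core idea of a tensor contraction layer is to shrink each non-batch mode of the activation tensor with a small factor matrix, instead of flattening it into a vector. A minimal numpy sketch of this contraction is below; the weights here are random stand-ins for learned parameters, and the function name is illustrative rather than the actual API of the cited work.

```python
import numpy as np

def tensor_contraction_layer(x, out_dims, rng):
    """Sketch of a tensor contraction layer: contract each non-batch mode
    of the activation tensor with a (here random, in practice learned)
    factor matrix, preserving its multi-linear structure."""
    for mode, d_out in enumerate(out_dims, start=1):
        W = rng.standard_normal((d_out, x.shape[mode])) / np.sqrt(x.shape[mode])
        # Contract mode `mode` of x against the second axis of W,
        # then move the new (smaller) axis back into place
        x = np.moveaxis(np.tensordot(W, x, axes=(1, mode)), 0, mode)
    return x

rng = np.random.default_rng(0)
activations = rng.standard_normal((32, 8, 8, 16))  # batch, height, width, channels
out = tensor_contraction_layer(activations, (4, 4, 8), rng)
print(out.shape)  # (32, 4, 4, 8)
```

Compared with flattening into a fully connected layer, this keeps the height/width/channel modes distinct while reducing the parameter count to one small matrix per mode.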