2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2016.637
TenSR: Multi-dimensional Tensor Sparse Representation

Cited by 41 publications (51 citation statements) | References 31 publications
“…In addition, as shown in Figure 2e, we give a detailed analysis of another notable representation form for the sparsity prior based on tensor sparse decomposition, which suggests that we can depict the structured sparsity of an HSI from the perspective of the core tensor. Some pioneering works are presented in [42, 43, 52–54]. Here, we draw attention to the structured sparsity formulation of an HSI under the tensor sparse representation framework, so that each third-order tensor $\mathcal{X}_p$ can be approximated via the following problem:…”
Section: Non-local Structure Sparsity Modeling
confidence: 99%
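The problem statement itself is truncated in the excerpt above. As a minimal sketch, assuming the usual sparse Tucker formulation (notation chosen here for illustration, not quoted from [42, 43, 52–54]), the approximation of a third-order patch tensor $\mathcal{X}_p$ takes the form

$$\min_{\mathcal{G}_p,\,\mathbf{U}_{1p},\,\mathbf{U}_{2p},\,\mathbf{U}_{3p}} \; \big\|\mathcal{X}_p - \mathcal{G}_p \times_1 \mathbf{U}_{1p} \times_2 \mathbf{U}_{2p} \times_3 \mathbf{U}_{3p}\big\|_F^2 \quad \text{s.t.} \quad S(\mathcal{G}_p) \le K,$$

where $\mathcal{G}_p$ is the core tensor, $\times_n$ denotes the mode-$n$ product, and $S(\cdot)$ is the sparsity measure discussed in the next excerpt.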
“…where $\mathbf{U}_{1p}$, $\mathbf{U}_{2p}$, and $\mathbf{U}_{3p}$ are factor matrices and $S(\mathcal{G}_p)$ is the sparse constraint term; we assume $S(\mathcal{G}_p) = \|\mathcal{G}_p\|_0$ as suggested in [42, 43, 52]. However, the optimization problem based on the $\ell_0$ constraint induced by Equation (8) is non-convex, so the research in [53, 54] further relaxes the $\ell_0$-based core sparsity to the $\ell_1$ case as $S(\mathcal{G}_p) = \|\mathcal{G}_p\|_1$. The corresponding convex optimization problem for the $\ell_1$ case can be written in Lagrangian form as follows:…”
Section: Non-local Structure Sparsity Modeling
confidence: 99%
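The Lagrangian itself is cut off in the excerpt. A generic $\ell_1$-relaxed form, written here as an illustration under the same assumed notation (not quoted verbatim from [53, 54]), would be

$$\min_{\mathcal{G}_p,\,\{\mathbf{U}_{np}\}} \; \tfrac{1}{2}\big\|\mathcal{X}_p - \mathcal{G}_p \times_1 \mathbf{U}_{1p} \times_2 \mathbf{U}_{2p} \times_3 \mathbf{U}_{3p}\big\|_F^2 + \lambda\,\|\mathcal{G}_p\|_1,$$

where $\lambda > 0$ balances reconstruction fidelity against sparsity of the core tensor $\mathcal{G}_p$.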
“…To solve (5), one may use an $N$-mode iterative soft thresholding approach, such as FISTA [3] or TISTA [16], as outlined in Algorithm 1. Note that the $\ell_1$-proximity operator may be computed easily as a soft thresholding operator…”
Section: Sparse Tucker Tensor Reconstruction
confidence: 99%
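As a minimal sketch of the soft-thresholding step referred to above, the following NumPy snippet performs one ISTA-style update of a sparse Tucker core with fixed mode dictionaries. All names, sizes, and the step-size choice are illustrative assumptions, not code from [3] or [16].

import numpy as np

def soft_threshold(x, tau):
    # Elementwise proximity operator of tau * ||.||_1 (soft thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def mode_n_product(tensor, matrix, mode):
    # Multiply a 3-way tensor by a matrix along the given mode.
    return np.moveaxis(np.tensordot(matrix, tensor, axes=(1, mode)), 0, mode)

def reconstruct(core, factors):
    # core x_1 U1 x_2 U2 x_3 U3
    out = core
    for mode, U in enumerate(factors):
        out = mode_n_product(out, U, mode)
    return out

def ista_core_step(X, core, factors, lam, step):
    # One iterative-soft-thresholding update of the core for fixed factors:
    # a gradient step on 0.5 * ||X - reconstruct(core)||_F^2, then shrinkage.
    residual = X - reconstruct(core, factors)
    grad = residual
    for mode, U in enumerate(factors):
        grad = mode_n_product(grad, U.T, mode)
    return soft_threshold(core + step * grad, step * lam)

# Toy usage with hypothetical sizes: an 8x8x8 patch and 12 atoms per mode.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 8, 8))
factors = [rng.standard_normal((8, 12)) for _ in range(3)]
factors = [U / np.linalg.norm(U, 2) for U in factors]  # unit spectral norms, so step 1.0 is safe
core = np.zeros((12, 12, 12))
for _ in range(100):
    core = ista_core_step(X, core, factors, lam=0.05, step=1.0)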
“…Two different kinds of sparse coding models have been proposed to preserve the intrinsic spatial structures of images: tensor sparse coding (TenSR) [5, 6] and convolutional sparse coding (CSC) [7, 8]. In TenSR models [5, 6], tensors are exploited for the image representation and a series of separable dictionaries is used to approximate the structures in each mode of the images. Though the spatial structures are preserved by tensor representations, the relationships between the learned sparse coefficients and dictionaries are more complicated, which makes the encoding (the sparse coefficients) hard to interpret.…”
Section: Introduction
confidence: 99%
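As a rough illustration of the separability point made above (all sizes hypothetical), three small mode dictionaries carry far fewer parameters than a single dictionary over vectorized patches with the same number of atoms:

# Separable mode dictionaries vs. one dictionary on vectorized 8x8x8 patches,
# with 12 atoms per mode (hypothetical sizes).
patch_dims = (8, 8, 8)
atoms_per_mode = 12
separable_params = sum(d * atoms_per_mode for d in patch_dims)   # 3 * 8 * 12 = 288
vectorized_params = 8 * 8 * 8 * atoms_per_mode ** 3              # 512 * 1728 = 884,736
print(separable_params, vectorized_params)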