2018
DOI: 10.1109/tpami.2017.2663423
Multi-Dimensional Sparse Models

Abstract: Traditional synthesis/analysis sparse representation models a signal in a one-dimensional (1D) way, in which a multidimensional (MD) signal is converted into a 1D vector. 1D modeling cannot sufficiently handle MD signals of high dimensionality within limited computational resources and memory, since it breaks the data structure and inherently ignores the diversity of MD signals (tensors). We utilize the multilinearity of tensors to establish the redundant basis of the space of multilinear maps with the sparsity …

Cited by 47 publications (23 citation statements). References 56 publications.
“…In addition, as shown in Figure 2e, we give a detailed analysis of another notable representation of the sparsity prior based on tensor sparse decomposition, which suggests that we can depict the structured sparsity of an HSI from the perspective of the core tensor. Some pioneering works are presented in [42,43,52-54]. Here, we draw attention to the structured sparsity formulation of an HSI under the tensor sparse representation framework, so that each third-order tensor X_p can be approximated by the following problem:…”
Section: Non-local Structure Sparsity Modeling (citation confidence: 99%)
“…where U_1^p, U_2^p, and U_3^p are factor matrices and S(G_p) is the sparsity constraint term; we assume S(G_p) = ‖G_p‖_0 as suggested in [42,43,52]. However, the optimization problem based on the l_0 constraint in Equation (8) is non-convex, so the research in [53,54] further relaxes the l_0-based core sparsity to the l_1 case, S(G_p) = ‖G_p‖_1. The convex optimization problem corresponding to the l_1 case can be represented in Lagrangian form as follows:…”
Section: Non-local Structure Sparsity Modeling (citation confidence: 99%)
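The sparse Tucker formulation quoted above — factor matrices plus an l_0/l_1-constrained core G_p — can be sketched in code. The following is an illustrative implementation only, not the exact algorithm of [53,54]: it computes HOSVD factor matrices, applies one l_1 soft-thresholding step to the core, and reconstructs; all function names are hypothetical.

```python
import numpy as np

def mode_unfold(T, mode):
    # Unfold tensor T along the given mode into a matrix.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_mult(T, M, mode):
    # Mode-n product T x_n M: multiply matrix M into mode `mode` of T.
    moved = np.moveaxis(T, mode, 0)
    shape = moved.shape
    res = M @ moved.reshape(shape[0], -1)
    return np.moveaxis(res.reshape((M.shape[0],) + shape[1:]), 0, mode)

def sparse_tucker(X, lam):
    # HOSVD factors: left singular vectors of each unfolding.
    U = [np.linalg.svd(mode_unfold(X, n), full_matrices=False)[0]
         for n in range(3)]
    # Core: G = X x1 U1^T x2 U2^T x3 U3^T.
    G = X
    for n in range(3):
        G = mode_mult(G, U[n].T, n)
    # l_1 relaxation of the core sparsity: soft-threshold G.
    G = np.sign(G) * np.maximum(np.abs(G) - lam, 0.0)
    # Reconstruct the approximation from the sparse core.
    Xh = G
    for n in range(3):
        Xh = mode_mult(Xh, U[n], n)
    return Xh, G, U
```

With lam = 0 and full orthogonal factors this reduces to plain HOSVD and reconstructs X exactly; increasing lam trades reconstruction fidelity for a sparser core.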
“…For the multiway filtering approaches described in (5), there are roughly two strategies to encourage the sparsity of the linear approximation in the transform domain: thresholding the core tensor C (via the l_0-norm [70], a Wiener filter [7,71], or soft- or hard-thresholding [7]), and thresholding the factor matrices via a low-rank prior [72]. Directly modeling each mode of 4D data with a low-rank prior raises two major concerns.…”
Section: Threshold Technique (citation confidence: 99%)
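The two thresholding rules named in the excerpt — l_0-style hard-thresholding versus l_1-style soft-thresholding of transform-domain coefficients — differ in how they treat the surviving coefficients. A minimal sketch (function names are illustrative):

```python
import numpy as np

def hard_threshold(coeffs, tau):
    # l_0-style rule: zero out coefficients with magnitude below tau,
    # leaving the survivors unchanged (unbiased).
    out = coeffs.copy()
    out[np.abs(out) < tau] = 0.0
    return out

def soft_threshold(coeffs, tau):
    # l_1-style rule: zero out small coefficients and shrink the
    # survivors toward zero by tau (biased, but yields a convex problem).
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - tau, 0.0)

c = np.array([-3.0, 0.5, 2.0])
print(hard_threshold(c, 1.0))  # [-3.  0.  2.]
print(soft_threshold(c, 1.0))  # [-2.  0.  1.]
```

In the multiway-filtering setting, either rule would be applied entrywise to the core tensor C before reconstruction.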
“…14(c) demonstrate the effectiveness of the twist implementation. Considering its efficiency, its performance may be improved by further modeling of tensor sparsity [70].…”
Section: Experiments on Urban HSI Data (citation confidence: 99%)