2021
DOI: 10.1049/ipr2.12077
Hyperspectral image and video compression using sparse Tucker tensor decomposition

Abstract: Hyperspectral images and videos provide rich spectral information, which facilitates accurate classification, unmixing, temporal change detection, and so on. However, with rapid improvements in technology, the data size has increased many fold. To handle the enormous data volume, efficient methods are required to compress the data. This paper proposes a multi-way approach for compression of a hyperspectral image or video sequence. In this approach, a differential representation of the da…
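The abstract describes compressing a hyperspectral cube by a sparse Tucker decomposition. As a minimal illustrative sketch (not the paper's exact algorithm), the core idea of Tucker-based compression can be shown with a truncated higher-order SVD (HOSVD) in NumPy; the cube dimensions, ranks, and synthetic data below are assumptions for illustration:

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold for the given target shape."""
    full = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape(full), 0, mode)

def mode_product(T, U, mode):
    """Multiply tensor T by matrix U along the given mode."""
    new_shape = T.shape[:mode] + (U.shape[0],) + T.shape[mode + 1:]
    return fold(U @ unfold(T, mode), mode, new_shape)

def tucker_compress(X, ranks):
    """Truncated HOSVD: factors are the leading left singular vectors
    of each unfolding; the core is X multiplied by each factor transpose."""
    factors = [np.linalg.svd(unfold(X, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    core = X
    for n, U in enumerate(factors):
        core = mode_product(core, U.T, n)
    return core, factors

def tucker_reconstruct(core, factors):
    Xhat = core
    for n, U in enumerate(factors):
        Xhat = mode_product(Xhat, U, n)
    return Xhat

rng = np.random.default_rng(0)
# Synthetic low-rank "hyperspectral" cube: 32x32 pixels, 16 bands.
A = rng.standard_normal((32, 4)); B = rng.standard_normal((32, 4))
C = rng.standard_normal((16, 3)); G = rng.standard_normal((4, 4, 3))
X = tucker_reconstruct(G, [A, B, C])

core, factors = tucker_compress(X, ranks=(4, 4, 3))
Xhat = tucker_reconstruct(core, factors)
err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2e}")
```

Storing the small core plus three thin factor matrices (48 + 304 numbers here) instead of the full 16,384-entry cube is what yields the compression; the paper additionally imposes sparsity on the decomposition, which this sketch omits.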


Cited by 25 publications (18 citation statements); references 41 publications.
“…Based on them, many tensor-based dimensionality reduction algorithms have been developed from well-known vector-based methods, such as multilinear PCA [20], tensor marginal Fisher analysis [14], tensor neighbourhood preserving embedding [13], tensor locality preserving projection [13], and so on. In recent years, several state-of-the-art tensor decomposition models and TDR algorithms [15,16,24,30,33,39,40] have been proposed. Here, the related TDR work is described in detail, especially the methods used in the following experiments for comparison.…”
Section: Related Work
confidence: 99%
“…Tensor data represent multi-dimensional data in which each dimension carries inherent structural information of the original data. TDR methods learn low-dimensional representations of tensor data with various strategies [12][13][14][15][16][17][18][19].…”
Section: Introduction
confidence: 99%
“…Of note, in Bayesian frameworks and NMF-based approaches, the 3D data structure is stacked into a matrix form, which causes a loss in the neighborhood structures, smoothness, and continuity characteristics. To avoid this, tensors or multiway arrays have been frequently used in hyperspectral data analysis for the purposes of image classification [31][32][33], data compression [34,35], change detection [36,37], target and anomaly detection [36,38], and denoising [39,40]. Additionally, exploiting the capability of multilinear algebra on the multiway array representations allows more flexibility in choosing constraints and describing data structures.…”
Section: Introduction
confidence: 99%
“…Some other researchers focus on tensor decomposition for TDR. For example, Zhou et al. [27] proposed an effective non-negative Tucker decomposition (NTD) to approximate a tensor by the mode product of a low-dimensional core tensor with projection matrices; Zhang et al. [28] used low-rank regularized heterogeneous tensor decomposition (LRR-HTD) to learn a set of orthogonal factor matrices; Cai et al. [29] proposed the graph regularized non-negative matrix factorization (GNMF) algorithm, which introduces an affinity graph to capture geometric information when seeking the matrix factorization; and [30] applied sparse Tucker tensor decomposition to the compression of hyperspectral images and video sequences.…”
Section: Introduction
confidence: 99%
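The statement above contrasts Tucker-style decompositions with NMF variants such as GNMF. As a minimal sketch of the factorization family being referenced, here is plain NMF with the classic Lee–Seung multiplicative updates (GNMF additionally adds a graph-Laplacian regularization term to the update for the coefficient matrix, which is omitted here); the matrix sizes and iteration count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
V = rng.random((40, 30))           # non-negative data matrix
k = 5                              # latent dimension
W = rng.random((40, k))            # basis matrix
H = rng.random((k, 30))            # coefficient matrix

eps = 1e-12                        # guard against division by zero
err_init = np.linalg.norm(V - W @ H)

# Lee-Seung multiplicative updates for the Frobenius-norm objective;
# both factors stay element-wise non-negative by construction.
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err_final = np.linalg.norm(V - W @ H)
print(f"reconstruction error: {err_init:.3f} -> {err_final:.3f}")
```

The multiplicative form guarantees the Frobenius reconstruction error is non-increasing across iterations, which is why the loop needs no step-size tuning.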