2020
DOI: 10.1109/tip.2020.2984849

NPSA: Nonorthogonal Principal Skewness Analysis

Abstract: Principal skewness analysis (PSA) has been introduced for feature extraction in hyperspectral imagery. As a third-order generalization of principal component analysis (PCA), it transforms the search for locally maximum-skewness directions into the problem of calculating the eigenpairs (the eigenvalues and corresponding eigenvectors) of a coskewness tensor. By combining a fixed-point method with an orthogonal constraint, it can prevent the new eigenpairs from converging to the same maxima …
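The fixed-point search the abstract describes can be made concrete. Below is a minimal sketch, under my own assumptions, of a PSA-style iteration: whiten the data, build the coskewness tensor, repeatedly apply u ← S ×₂ u ×₃ u with normalization, and deflate with an orthogonal projector so later directions avoid earlier maxima. The function names (`whiten`, `coskewness_tensor`, `psa`) and the deflation details are illustrative, not the paper's released implementation; the nonorthogonal relaxation that gives NPSA its name is not shown, since the abstract is truncated.

```python
# Hedged sketch of a PSA-style fixed-point iteration (not the authors' code).
import numpy as np

def whiten(X):
    """Center and whiten data X of shape (bands, pixels); assumes full rank."""
    Xc = X - X.mean(axis=1, keepdims=True)
    vals, vecs = np.linalg.eigh(np.cov(Xc))
    W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T   # symmetric whitening matrix
    return W @ Xc

def coskewness_tensor(Z):
    """Third-order coskewness tensor S_ijk = mean_n z_i z_j z_k."""
    return np.einsum('in,jn,kn->ijk', Z, Z, Z) / Z.shape[1]

def psa(X, n_components=3, n_iter=200, tol=1e-10, seed=None):
    rng = np.random.default_rng(seed)
    Z = whiten(X)
    S = coskewness_tensor(Z)
    d = Z.shape[0]
    U = np.zeros((d, n_components))
    P = np.eye(d)                          # projector enforcing orthogonality
    for k in range(n_components):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        for _ in range(n_iter):
            v = P @ np.einsum('ijk,j,k->i', S, u, u)   # fixed-point update
            norm = np.linalg.norm(v)
            if norm < 1e-12:
                break
            v /= norm
            done = np.linalg.norm(v - u) < tol
            u = v
            if done:
                break
        U[:, k] = u
        P = P - np.outer(u, u)             # deflate to the orthogonal complement
    return U                                # columns are skewness directions
```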

Cited by 8 publications (10 citation statements)
References 48 publications

“…The purpose of BIS is to estimate the mixing matrix, denoted as B. To evaluate the separation performance of the PPSA algorithm, we compare it with algorithms such as FastICA [15], PSA [17], MPSA [18], NPSA [21], and MSDP [25]. In this experiment, this paper selects n grayscale images of size 256 × 256 pixels as the source images, where n ranges from 2 to 6.…”
Section: Experiments 1: Blind Image Separation
confidence: 99%
“…In order to quantitatively evaluate the performance of the six algorithms above, we use five indicators to assess their separation results: intersymbol interference (ISI) [21], total mean square error (TMSE) [21], the correlation coefficient [21], peak signal-to-noise ratio (PSNR) [21], and running time (T). It is worth noting that, except for the PPSA and MSDP algorithms, the results obtained by the other four algorithms are random, so we take the average of 10 runs as the final result.…”
Section: Experiments 1: Blind Image Separation
confidence: 99%
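For concreteness, here is a rough sketch, in my own notation rather than the cited paper's, of two of the indicators listed in the excerpt above: PSNR and the correlation coefficient between a source image `s` and its recovered estimate `y` (8-bit grayscale assumed).

```python
# Illustrative metric implementations; names and conventions are assumptions.
import numpy as np

def psnr(s, y, peak=255.0):
    """Peak signal-to-noise ratio in dB between source s and estimate y."""
    mse = np.mean((s.astype(float) - y.astype(float)) ** 2)
    return np.inf if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def correlation(s, y):
    """Pearson correlation coefficient between flattened images."""
    s = s.ravel() - s.mean()
    y = y.ravel() - y.mean()
    return float(np.dot(s, y) / (np.linalg.norm(s) * np.linalg.norm(y)))
```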
“…There are many algorithms and numerous applications that have been investigated on this subject; see, for example, [11,12,13,14,15,16,17,18,19]. In this paper, we mainly investigate the Z-eigenpair problem for a special class of symmetric tensors that are orthogonally diagonalizable.…”
Section: Introduction
confidence: 99%
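To illustrate the class of tensors this excerpt refers to, the following toy construction (my own, not taken from [11–19]) builds an orthogonally diagonalizable symmetric tensor T = Σ_k λ_k u_k ⊗ u_k ⊗ u_k with orthonormal u_k and checks that each (λ_k, u_k) satisfies the Z-eigenpair identity T ×₂ u ×₃ u = λ u.

```python
# Toy check of Z-eigenpairs for an orthogonally diagonalizable symmetric tensor.
import numpy as np

rng = np.random.default_rng(0)
d = 4
Q, _ = np.linalg.qr(rng.standard_normal((d, d)))   # orthonormal columns u_k
lam = np.array([3.0, 2.0, 1.5, 0.5])

# T_ijl = sum_k lam_k * Q_ik * Q_jk * Q_lk
T = np.einsum('k,ik,jk,lk->ijl', lam, Q, Q, Q)

u = Q[:, 0]
lhs = np.einsum('ijl,j,l->i', T, u, u)             # T x2 u x3 u
print(np.allclose(lhs, lam[0] * u))                # True: (lam_0, u_0) is a Z-eigenpair
```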
“…Tensor decompositions of moment and cumulant tensors are used in a variety of statistical and data science applications, including independent component analysis and blind source separation [10,13,15], clustering [34,12], learning Gaussian mixture models [23,4,20,18,33], latent variable models [2,3], outlier detection [16,1], feature extraction in hyperspectral imagery [19], and multireference alignment [32]. In these cases, it is assumed that the empirical higher-order moment is already computed.…”
confidence: 99%