2019
DOI: 10.1109/jstars.2019.2896031

Hyperspectral Image Denoising via Subspace-Based Nonlocal Low-Rank and Sparse Factorization

Cited by 69 publications (56 citation statements); References 53 publications.
“…Gaussian. When the noise is a mixture of Gaussian noise, stripes, and impulse noise, the spectral subspace is usually estimated jointly with the subspace coefficients of the HSI; for example, in double-factor-regularized low-rank tensor factorization (LRTF-DFR) [21] and subspace-based nonlocal low-rank and sparse factorization (SNLRSF) [22]. Joint estimation of the subspace and the corresponding coefficients usually produces poor estimates of the subspace when the HSI is affected by severe mixed noise.…”
Section: Related Work
confidence: 99%
“…The non-convex rank constraint is usually relaxed by minimizing the nuclear norm of the HSI, as done in the low-rank matrix recovery (LRMR) method [15] and in the weighted sum of weighted tensor nuclear norm minimization (WSWTNNM) method [16]. The low-rank structure of HSIs is also exploited by representing the spectral vectors of the clean image in an orthogonal subspace, in [8,17–20] for Gaussian noise removal and in [21,22] for mixed noise removal. Subspace representation is an explicit low-rank representation in the sense that the rank is constrained by the dimension of the subspace.…”
Section: Introduction
confidence: 99%
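As a hedged illustration of this explicit low-rank structure — a generic numpy sketch using a plain SVD basis, with made-up sizes, not the specific algorithm of any cited method — projecting the band-unfolded HSI onto a k-dimensional orthogonal spectral subspace bounds the rank of the reconstruction by k:

```python
import numpy as np

# Toy HSI unfolded to a (bands x pixels) matrix with rank-k spectral structure
# (all sizes here are illustrative assumptions).
rng = np.random.default_rng(0)
bands, pixels, k = 30, 500, 5
E_true = np.linalg.qr(rng.standard_normal((bands, k)))[0]  # orthonormal spectral basis
Z_true = rng.standard_normal((k, pixels))                  # subspace coefficients
Y = E_true @ Z_true + 0.01 * rng.standard_normal((bands, pixels))  # noisy observation

# Subspace representation: estimate an orthogonal basis E from the leading
# left singular vectors of Y, take coefficients Z = E^T Y; then X = E Z
# has rank at most k, the subspace dimension.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
E = U[:, :k]
Z = E.T @ Y
X = E @ Z  # rank(X) <= k by construction
```

Note the low-rank constraint is enforced structurally (by the factor sizes) rather than through a penalty term, which is what makes the representation "explicit".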
“…Recently, image prior methods based on nonlocal self-similarity [13,14] and low-rank matrix approximation [15–18] have been able to preserve image edge details while denoising, and have achieved some success in image denoising [19,20]. Low-rank matrix approximation aims to recover the underlying low-rank matrix from degraded observations.…”
Section: Introduction
confidence: 99%
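A hedged sketch of this idea is singular value thresholding — the proximal operator of the nuclear norm — which recovers a low-rank matrix from a noisy observation by shrinking its singular values. The sizes, rank, and threshold below are illustrative assumptions, not taken from the cited works:

```python
import numpy as np

def svt(Y, tau):
    """Singular value thresholding: soft-threshold the singular values of Y by tau
    (the proximal operator of tau * nuclear norm)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

rng = np.random.default_rng(1)
L = rng.standard_normal((40, 8)) @ rng.standard_normal((8, 60))  # rank-8 ground truth
Y = L + 0.05 * rng.standard_normal(L.shape)                      # degraded observation
X = svt(Y, tau=1.0)
# Thresholding zeroes out the small, noise-dominated singular values,
# so X is low-rank while Y is full-rank.
```

The threshold tau trades off noise suppression against shrinkage bias on the signal's singular values; iterative schemes built on this operator re-apply it inside a data-fidelity loop.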
“…However, in practice, a system of linear equations with as many equations as unknowns is very rare, and the solution often ends up being sparse. Sparse solutions of such systems also find numerous applications, especially in image processing [25], [9], [4], [17]. The field of sparse representation derives its structure from conventional transforms, which ensure simplicity of processing, efficiency of representation, speed, etc.…”
Section: Introduction
confidence: 99%
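A minimal sketch of finding a sparse solution to an underdetermined system, using orthogonal matching pursuit — a standard greedy algorithm, chosen here for brevity; the matrix sizes and sparsity level are illustrative assumptions, not taken from the cited papers:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily build a k-sparse x with A @ x ≈ y."""
    r, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(A.T @ r))))
        # Re-fit coefficients on the selected support by least squares.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(2)
A = rng.standard_normal((30, 100))   # underdetermined: 30 equations, 100 unknowns
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [1.0, -2.0, 1.5]  # 3-sparse ground truth
y = A @ x_true
x_hat = omp(A, y, k=3)               # at most 3 nonzero entries by construction
```

The system has infinitely many solutions; the sparsity prior is what singles one out, which is the sense in which sparse solutions carry useful structure.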
“…For a quantitative analysis of the obtained results, the algorithm uses PSNR, ISNR, and SSIM; these values for the Cameraman, Football, and Lena images, each with dimensions 20 × 20, 40 × 40, and 60 × 60, are tabulated in Tables 3–5. Due to space constraints, only the tabulated results for Cameraman are presented.…”
confidence: 99%
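As a hedged sketch of one of these metrics, the standard PSNR computation for 8-bit images (the image values here are illustrative assumptions; ISNR and SSIM follow their own, analogous formulas):

```python
import numpy as np

def psnr(ref, est, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(peak^2 / MSE). Higher is better."""
    mse = np.mean((ref.astype(float) - est.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((20, 20), 128.0)
est = ref + 16.0  # uniform error of 16 gray levels -> MSE = 256
print(round(psnr(ref, est), 2))  # 10*log10(255^2 / 256) ≈ 24.05 dB
```

PSNR depends only on the mean squared error, which is why SSIM is typically reported alongside it to capture structural fidelity.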