2018
DOI: 10.1109/jstars.2017.2779539

Hyperspectral Image Restoration Via Total Variation Regularized Low-Rank Tensor Decomposition

Abstract: Hyperspectral images (HSIs) are often corrupted by a mixture of several types of noise during the acquisition process, e.g., Gaussian noise, impulse noise, dead lines, stripes, and many others. Such complex noise can degrade the quality of the acquired HSIs and limit the precision of subsequent processing. In this paper, we present a novel tensor-based HSI restoration approach that fully identifies the intrinsic structures of the clean HSI part and the mixed noise part, respectively. Specifically, for the …
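The abstract is truncated, so the following is only a hedged sketch of the generic form that TV-regularized low-rank tensor restoration models take, not necessarily the authors' exact formulation: the observed HSI $\mathcal{Y}$ is split into a clean low-rank part $\mathcal{X}$, a sparse noise part $\mathcal{S}$ (impulse noise, dead lines, stripes), and Gaussian noise $\mathcal{N}$, with total variation regularizing the clean part. The weights $\tau$, $\lambda$, $\beta$ and the factorization choice are placeholders.

```latex
\[
  \min_{\mathcal{X},\,\mathcal{S},\,\mathcal{N}}\;
    \tau\,\|\mathcal{X}\|_{\mathrm{TV}}
    + \lambda\,\|\mathcal{S}\|_{1}
    + \beta\,\|\mathcal{N}\|_{F}^{2}
  \quad\text{s.t.}\quad
  \mathcal{Y} = \mathcal{X} + \mathcal{S} + \mathcal{N},
  \;\; \mathcal{X}\ \text{constrained to be low rank (e.g., via a Tucker or CP factorization).}
\]
```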

Citations: cited by 366 publications (245 citation statements)
References: 49 publications
“…$\mu$ can be initially set as $10^{-2}$ and updated in each iteration by $\mu := \min(\eta\mu,\ \mu_{\max})$, where $\eta = 1.1$. As introduced in [45], [47], [48], $\gamma$ can be set as $100/\sqrt{I_1 I_2}$. In our experience, $\alpha$ is selected in the range from 0 to 0.2 and $\beta$ can be chosen between 0 and 10.…”
Section: B. Parameter Selection (mentioning)
confidence: 99%
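The quoted schedule is a standard penalty-parameter update of the kind used in ADMM-style solvers. Below is a minimal Python sketch of the quoted choices; `mu_max` and the default values of `alpha` and `beta` are assumptions, since the excerpt only gives their ranges.

```python
import math


def default_parameters(I1, I2, alpha=0.1, beta=1.0):
    """Collect the parameter choices quoted above.

    alpha and beta defaults are placeholders inside the quoted ranges
    (alpha in [0, 0.2], beta in [0, 10]); mu_max is an assumed cap,
    not stated in the excerpt.
    """
    return {
        "mu": 1e-2,                           # initial penalty parameter
        "mu_max": 1e6,                        # assumed upper bound on mu
        "eta": 1.1,                           # growth factor per iteration
        "gamma": 100.0 / math.sqrt(I1 * I2),  # gamma = 100 / sqrt(I1 * I2)
        "alpha": alpha,
        "beta": beta,
    }


def update_mu(mu, eta=1.1, mu_max=1e6):
    """One iteration of the quoted update: mu := min(eta * mu, mu_max)."""
    return min(eta * mu, mu_max)
```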
“…Before introducing PMLSA, we consider how to obtain ${}^{-1}_{j}$ in (5). For the univariate model $Y_j = XB_j + E_j$, Koenker [23] indicated that the asymptotic covariance of the estimation $\hat{B}_j$ has the common property…”
Section: A. Models (mentioning)
confidence: 99%
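The excerpt is cut off and the base symbol of the inverse term in (5) was lost in extraction. As a hedged reference point only, the ordinary least-squares case of the quoted model, assuming i.i.d. errors with variance $\sigma_j^2$, gives the following estimator and asymptotic covariance; this is a standard result and not necessarily the property Koenker's result refers to.

```latex
\[
  \hat{B}_j = (X^{\top}X)^{-1} X^{\top} Y_j,
  \qquad
  \operatorname{Cov}\bigl(\hat{B}_j\bigr) \;\approx\; \sigma_j^{2}\,(X^{\top}X)^{-1}.
\]
```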
“…In particular, multi-dimensional arrays (i.e., tensors) provide a natural representation for such data. A tensor, regarded as a multilinear generalization of a matrix/vector, can mathematically model multi-dimensional data structures, which has made tensor learning attractive, with a growing number of applications in computer vision [1][2][3][4], machine learning [5,6], signal processing [7,8], and pattern recognition [9] in recent years. Unfortunately, the challenge of missing elements in actually observed tensors limits these applications.…”
Section: Introduction (mentioning)
confidence: 99%
“…Obviously, the difference among the existing LRTC models mainly lies in the choice of f. Although various tensor rank surrogates [3,6,18] have been proposed to approximate the tensor rank, they all face challenges in practical applications. Based on the CP decomposition [15], Friedland et al. [18] claimed that the CP-rank could be relaxed with the CP tensor nuclear norm (CNN), a convex surrogate that can be defined as:…”
Section: Introduction (mentioning)
confidence: 99%
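The quoted definition is cut off before the formula. For orientation only, the tensor nuclear norm associated with the CP decomposition (as studied by Friedland and Lim) is usually written as the following gauge over unit-norm rank-one terms; this is the standard statement, not a quote from the citing paper.

```latex
\[
  \|\mathcal{X}\|_{*}
  = \min\Bigl\{ \sum_{r=1}^{R} |\lambda_r| \;:\;
      \mathcal{X} = \sum_{r=1}^{R} \lambda_r\, u_r^{(1)} \circ u_r^{(2)} \circ \cdots \circ u_r^{(N)},\;
      \|u_r^{(n)}\|_2 = 1,\; R \in \mathbb{N} \Bigr\}.
\]
```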