2022
DOI: 10.1109/tpami.2022.3204203
Exact Decomposition of Joint Low Rankness and Local Smoothness Plus Sparse Matrices

Cited by 26 publications (5 citation statements)
References 65 publications
“…Wang et al [19] integrated the spatial-spectral joint TV into low-Tucker-rank tensor decomposition. Besides, there has also emerged several transform-based TV variants, such as group TV [34], E3DTV [35], CTV [36], and t-CTV [37], gaining improved denoising performance. Except for the local prior, the nonlocal self-similarity is more widely used in many works for normal Gaussian denoising tasks [17], [20], [21], [22], [23].…”
Section: A. NSR-Unrelated Work
confidence: 99%
“…We first conduct denoising experiments on simulated datasets with ground truth to quantitatively evaluate the performance of the proposed NS3R method. The following 13 related methods are used for comparison, which can be divided into two categories corresponding to the related works in Section II, i.e., the NSR-unrelated methods, including BM4D [55], LRMR [10], E3DTV [35], CTV [36], TDL [17], LLRT [20], KBR [21], and RCTV [42], and NSR-related methods, including FastHyDe [25], SNLRSF [27], NGmeet [28], GLF [26], and TenSRDe [29]. They are mostly representative of the state-of-the-art for HSI denoising.…”
Section: A. Simulated Experiments
confidence: 99%
“…4) Fused Low-Rank and Local Smoothness Prior: Recently, there has been a trend to combine gradient sparsity and low-rank priors into a single regularization term [50]-[52]. One typical method is the Representative Coefficient Total Variation (RCTV), which operates within the matrix factorization framework, formulated as X = UV^T in matrix notation, or expressed as X = U ×_3 V in tensor notation.…”
Section: B. Internal Prior Modeling
confidence: 99%
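The fused prior quoted above hinges on one observation: if the unfolded data matrix is factored as X = UV^T with small rank r, the smoothness penalty only needs to act on the r coefficient maps in U rather than on every spectral band. A minimal NumPy sketch of that idea follows; the shapes, the truncated-SVD factorization, and the plain anisotropic TV sum are illustrative assumptions, not the exact RCTV algorithm:

```python
import numpy as np

# Sketch (assumed setup): a hyperspectral cube of size (h, w, bands) is
# unfolded along mode 3 into X of shape (h*w, bands) and factored as
# X = U @ V.T with rank r << bands.
rng = np.random.default_rng(0)
h, w, bands, r = 8, 8, 16, 3

U_true = rng.standard_normal((h * w, r))   # spatial coefficient maps
V_true = rng.standard_normal((bands, r))   # spectral basis
X = U_true @ V_true.T                      # exactly rank-r data matrix

# One common way to obtain the factors: truncated SVD.
Uf, s, Vt = np.linalg.svd(X, full_matrices=False)
U = Uf[:, :r] * s[:r]                      # representative coefficients, (h*w, r)
V = Vt[:r].T                               # spectral basis, (bands, r)

# For noiseless rank-r data the factorization is exact.
assert np.allclose(U @ V.T, X)

# Anisotropic TV evaluated on the r coefficient maps only, instead of on
# all `bands` bands -- the economy the fused prior exploits.
maps = U.reshape(h, w, r)
tv_on_U = np.abs(np.diff(maps, axis=0)).sum() + np.abs(np.diff(maps, axis=1)).sum()
print(tv_on_U)
```

In an actual denoising solver this TV term on U would be minimized jointly with a data-fidelity term, typically by alternating updates of U and V; the sketch only shows where the penalty attaches.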
“…The compared methods include tensor dictionary learning (TDL) [48], TCTV [51], LMHTV [39], LTHTV [39], LRTV [16], non-local meets global (NGMeet) [69], RCTV [52], weighted non-local low-rank model with adaptive TV regularization (WNLRATV) [44], BALMF [11], and CTV [50]. Besides these internal information based methods, it also compares deep learning methods, including RC image learnable denoiser (RCILD) [70], and hierarchical low-rank tensor factorization (HLRTF) [71].…”
Section: B. Experiments On Synthetic Datasets
confidence: 99%
“…To address the challenges, most conventional approaches borrow certain hand-crafted priors to enforce the reconstructed images enjoy truthful visual properties. For example, techniques such as sparse representation (Wang et al 2023a), low rankness (Peng et al 2023), edge sharpness (Huang et al 2022), and non-local similarity (Wen et al 2023) have been extensively investigated and yield commendable performance. These solutions incrementally ameliorate image quality through iterative updates, which often suffer from the expense of massive execution time and the risk of oversmoothed recoveries.…”
Section: Introduction
confidence: 99%