2017
DOI: 10.1109/tci.2017.2697206
Efficient Sum of Outer Products Dictionary Learning (SOUP-DIL) and Its Application to Inverse Problems

Abstract: The sparsity of signals in a transform domain or dictionary has been exploited in applications such as compression, denoising, and inverse problems. More recently, data-driven adaptation of synthesis dictionaries has shown promise compared to analytical dictionary models. However, dictionary learning problems are typically non-convex and NP-hard, and the usual alternating minimization approaches for these problems are often computationally expensive, with the computations dominated by the NP-hard synthesis spar…

Cited by 37 publications (63 citation statements).
References 78 publications (239 reference statements).
“…Tensor-structured (patch-based) dictionary learning has also been exploited recently for dynamic CT [120] and spectral CT [121] reconstructions. 4) Recent Efficient Dictionary Learning-Based Methods: Recent work proposed efficient dictionary learning-based reconstruction algorithms, dubbed SOUP-DIL image reconstruction algorithms [122], that used the following regularizer:…”
Section: B. Synthesis Dictionary Learning-Based Approaches for Reconstruction (mentioning)
Confidence: 99%
“…While the earlier DL-MRI used inexact (greedy) and expensive sparse code updates and lacked convergence analysis, the SOUP-DIL scheme used efficient, exact updates, was proven to converge to critical points (generalized stationary points) of the underlying problems, and improved image quality over several schemes [122]. Fig.…”
Section: B. Synthesis Dictionary Learning-Based Approaches for Reconstruction (mentioning)
Confidence: 99%
“…Recent works have also developed efficient synthesis dictionary learning-based reconstruction schemes such as SOUP-DIL MRI [28] and LASSI [22], which uses a low-rank + learned dictionary-sparse model for dynamic MRI. The most recent trend involves supervised (e.g., deep) learning of MRI models such as those based on convolutional neural networks [10,[29][30][31][32].…”
Section: B. Data-Driven or Learning-Based Models for Reconstruction (mentioning)
Confidence: 99%
“…The patch size used in STL-MRI was 8 × 8 for a fair comparison to STROLLR-MRI. The DL-MRI settings are as used in [28]. DL-MRI, STL-MRI, and STROLLR-MRI follow the BCS settings.…”
(mentioning)
Confidence: 99%
“…where $P_j \in \mathbb{C}^{n \times M}$ is the matrix whose columns contain the patches $P_l x_j$ for $1 \le l \le M$, and $c_{t,i}$ is the $i$th column of $C_t$. We use a block coordinate descent approach [47] (with few iterations) to update the sparse coefficients $c_{t,i}$ and atoms $d_i$ (columns of $D$) in (3) sequentially. For each $1 \le i \le m$, we first minimize (3) with respect to $c_{t,i}$ keeping the other variables fixed (sparse coding step), and then we update $d_i$ keeping the other variables fixed (dictionary atom update step).…”
Section: A. Dictionary Learning Step for (P1) (mentioning)
Confidence: 99%
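The alternating sparse coding and atom updates quoted above admit closed-form solutions in the SOUP-DIL framework. A minimal NumPy sketch of one such block coordinate descent pass is given below, assuming the SOUP-DILLO-style formulation with an $\ell_0$ penalty and unit-norm atoms; the function name `soup_dillo` and its parameters are illustrative, not from the cited implementations:

```python
import numpy as np

def soup_dillo(Y, D, C, lam, n_iters=10, rng=None):
    """Sketch of SOUP-DIL block coordinate descent.

    Approximately minimizes ||Y - D C^T||_F^2 + lam^2 ||C||_0 over
    unit-norm dictionary atoms d_i (columns of D, shape (n, m)) and
    sparse coefficient columns c_i (columns of C, shape (N, m)),
    updating each (c_i, d_i) pair in sequence.
    """
    rng = rng or np.random.default_rng(0)
    n, m = D.shape
    for _ in range(n_iters):
        for i in range(m):
            # Residual with the i-th outer product d_i c_i^T removed
            E = Y - D @ C.T + np.outer(D[:, i], C[:, i])
            # Sparse coding step: exact update by hard thresholding
            b = E.T @ D[:, i]
            C[:, i] = np.where(np.abs(b) > lam, b, 0.0)
            # Dictionary atom update step: normalized projection of
            # the residual onto the new coefficients
            h = E @ C[:, i]
            norm = np.linalg.norm(h)
            if norm > 0:
                D[:, i] = h / norm
            else:
                # Degenerate case (c_i = 0): reset to a random unit-norm atom
                v = rng.standard_normal(n)
                D[:, i] = v / np.linalg.norm(v)
    return D, C
```

Because each coefficient column and each atom is updated by an exact minimization of the objective with the other variables held fixed, the objective is monotonically non-increasing across the sweep, which is the property underlying the convergence guarantees mentioned in the citation statements above.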