2017
DOI: 10.1109/tsp.2017.2733447

Working Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding

Abstract: The celebrated sparse representation model has led to remarkable results in various signal processing tasks in the last decade. However, despite its initial purpose of serving as a global prior for entire signals, it has been commonly used for modeling low dimensional patches due to the computational constraints it entails when deployed with learned dictionaries. A way around this problem has been recently proposed, adopting a convolutional sparse representation model. This approach assumes that the g…
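As a rough illustration of the model the abstract describes, the sketch below (a toy under assumed names and sizes, not code from the paper) synthesizes a global signal as a sum of short filters circularly convolved with sparse coefficient maps:

import numpy as np

rng = np.random.default_rng(0)
N, n, m = 64, 8, 4                      # signal length, filter length, number of filters (invented)
filters = rng.standard_normal((m, n))   # local filters d_i

# Sparse coefficient maps gamma_i: a few nonzeros per filter.
maps = np.zeros((m, N))
for i in range(m):
    idx = rng.choice(N, size=3, replace=False)
    maps[i, idx] = rng.standard_normal(3)

# Global signal X = sum_i d_i (*) gamma_i, via FFT-based circular convolution.
signal = sum(np.real(np.fft.ifft(np.fft.fft(maps[i]) * np.fft.fft(filters[i], N)))
             for i in range(m))
print(signal.shape)                     # (64,)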

Cited by 104 publications (106 citation statements)
References 50 publications (117 reference statements)
“…The P0,∞ problem. A new prior for convolutional sparse coding was recently proposed in [33]. Rather than minimizing the ℓ0 "norm" (the total number of nonzero coefficients in the convolutional representation), this model minimizes the ℓ0,∞ group "norm".…”
Section: Of D Computed Efficiently Using Convolutions With Columns Of D
confidence: 99%
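To make the ℓ0,∞ group "norm" in this statement concrete, here is a minimal sketch, assuming a circular model and an (m, N) array of coefficient maps; it counts the nonzeros in each stripe (the coefficients that can influence one signal sample) and returns the maximum. The function name and coefficient layout are assumptions, not from [33]:

import numpy as np

def l0_inf(maps, n):
    # maps: (m, N) coefficient maps; n: filter length (circular model assumed).
    m, N = maps.shape
    nonzero = maps != 0
    best = 0
    for j in range(N):
        # Stripe j: all coefficients within n-1 positions of sample j, across all maps.
        offsets = np.arange(j - n + 1, j + n) % N
        best = max(best, int(nonzero[:, offsets].sum()))
    return best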
“…The work in [33] provides guarantees for the success of the standard OMP and ℓ1 relaxation in the ideal and noisy regimes. It proposes optimization methods for solving an approximation of (5) using the Alternating Direction Method of Multipliers (ADMM) [6] to minimize the ℓ1 norm of the representation with additional penalties, which bias the solution towards a small ℓ0,∞ "norm".…”
Section: Of D Computed Efficiently Using Convolutions With Columns Of D
confidence: 99%
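The statement above refers to an ADMM scheme with additional ℓ0,∞-biasing penalties; those penalties are specific to [33] and are omitted in the generic sketch below, which solves only the plain ℓ1-regularized problem min 0.5·||Dγ − x||² + λ||γ||₁ and should be read as a starting point, not the paper's method:

import numpy as np

def admm_l1(D, x, lam, rho=1.0, iters=200):
    # Standard ADMM (scaled dual form) for min_g 0.5*||D g - x||^2 + lam*||g||_1.
    m = D.shape[1]
    z = np.zeros(m)
    u = np.zeros(m)
    A = D.T @ D + rho * np.eye(m)       # system matrix, reused every iteration
    Dtx = D.T @ x
    for _ in range(iters):
        g = np.linalg.solve(A, Dtx + rho * (z - u))                      # quadratic step
        z = np.sign(g + u) * np.maximum(np.abs(g + u) - lam / rho, 0.0)  # soft-threshold
        u += g - z                                                       # dual update
    return z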
“…The convolutional sparse representation model [27], [28], [29], where the dictionary is a concatenation of circulant matrices, has been extensively studied in the past. Furthermore, recent work [13]…”
Section: Introduction
confidence: 99%
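To illustrate the dictionary structure this statement describes, the following sketch (assuming a circular model, with invented sizes and names) builds the global dictionary as a horizontal concatenation of circulant matrices and checks that multiplying by it matches a sum of circular convolutions:

import numpy as np
from scipy.linalg import circulant

rng = np.random.default_rng(1)
N, n, m = 64, 8, 4
filters = rng.standard_normal((m, n))

# One banded circulant block per filter (first column = zero-padded filter),
# concatenated horizontally into the global dictionary D of shape (N, m*N).
D = np.hstack([circulant(np.concatenate([f, np.zeros(N - n)])) for f in filters])

gamma = rng.standard_normal(m * N)
x = D @ gamma

# The same synthesis as a sum of circular convolutions, one per filter.
x2 = sum(np.real(np.fft.ifft(np.fft.fft(gamma[i*N:(i+1)*N]) * np.fft.fft(filters[i], N)))
         for i in range(m))
assert np.allclose(x, x2)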