Online Sparsifying Transform Learning—Part II: Convergence Analysis
2015 | DOI: 10.1109/jstsp.2015.2407860

Abstract: Sparsity-based techniques have been widely popular in signal processing applications such as compression, denoising, and compressed sensing. Recently, the learning of sparsifying transforms for data has received interest. The advantage of the transform model is that it enables cheap and exact computations. In Part I of this work, efficient methods for online learning of square sparsifying transforms were introduced and investigated (by numerical experiments). The online schemes process signals sequentially, an…
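The abstract's "cheap and exact computations" refers to sparse coding under the transform model: for a transform W and signal y, the best s-sparse approximation of W y in the transform domain is obtained by hard thresholding, with no iterative optimization. A minimal NumPy sketch (the function name and dense representation are illustrative choices, not code from the paper):

```python
import numpy as np

def transform_sparse_code(W, y, s):
    """Exact transform-model sparse coding: apply W, then keep the
    s largest-magnitude coefficients and zero out the rest."""
    z = W @ y                          # transform-domain coefficients
    x = np.zeros_like(z)
    keep = np.argsort(np.abs(z))[-s:]  # indices of the s largest entries
    x[keep] = z[keep]
    return x

# usage: sparse-code a random 64-dimensional signal with sparsity 8
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))
y = rng.standard_normal(64)
x = transform_sparse_code(W, y, s=8)
```

The whole operation costs one matrix-vector product plus a sort, which is what distinguishes the transform model from synthesis sparse coding, where the exact problem is NP-hard and is solved approximately by iterative algorithms.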

Cited by 43 publications (20 citation statements) | References 12 publications
“…In practice, the required number of iterations depends on the data and on algorithm initialization, and is typically larger for bigger or more complex problems. On the other hand, as shown in Part II [45], and in the experiments of Section IV, the online and mini-batch schemes produce good transforms once the total number of signals processed sequentially is large. Therefore, the net computational cost for processing the signals (and converging) with the online scheme scales in proportion to the number of signals processed.…”
Section: Comparison of Transform Learning Schemes
confidence: 82%
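The extraction dropped the inline math in this statement, but the cost argument has a simple shape. A hedged reconstruction with generic symbols (P for the batch iteration count, N for the batch training-set size, M for the number of streamed signals, C(n) for the per-signal cost at dimension n; these are illustrative, not necessarily the papers' notation):

```latex
% Batch learning revisits all N signals for P data- and
% initialization-dependent iterations:
\mathrm{cost}_{\mathrm{batch}} = O\!\left(P \, N \, C(n)\right)
% The online scheme touches each of the M streamed signals once, so the
% net cost of processing the stream (and converging) is linear in M:
\mathrm{cost}_{\mathrm{online}} = O\!\left(M \, C(n)\right)
```

The quoted point is that P is hard to bound a priori, whereas the online charge per signal is fixed, which is what makes the sequential scheme attractive when M is large.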
“…As we show in this work, online transform learning involves cheap computations and modest memory requirements. Moreover, in Part II of this work [45], convergence guarantees are provided for online transform learning. Our numerical experiments illustrate the usefulness of our schemes for big data processing (online sparse representation and denoising).…”
Section: B. Online Learning and Big Data
confidence: 99%
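As a concrete illustration of the "cheap computations and modest memory requirements" claim, here is a minimal Python sketch of a sequential transform learning loop. The closed-form update mirrors the square-root/SVD solution known from the authors' batch transform learning formulation, applied to running averages; the identity initialization, the fixed regularizer lam, and the plain averaging are simplifying assumptions, and this is not the exact algorithm of Part I:

```python
import numpy as np

def hard_threshold(z, s):
    """Keep the s largest-magnitude entries of z, zero out the rest."""
    x = np.zeros_like(z)
    keep = np.argsort(np.abs(z))[-s:]
    x[keep] = z[keep]
    return x

def online_transform_learning(signals, s, lam=1e-2):
    """Hypothetical sequential sketch: sparse-code each incoming signal
    with the current transform, update running second-order statistics,
    then refresh W via a closed-form square-root/SVD update."""
    n = signals[0].size
    W = np.eye(n)                      # identity initialization (assumption)
    Gamma = np.zeros((n, n))           # running average of y y^T
    Theta = np.zeros((n, n))           # running average of y x^T
    for t, y in enumerate(signals, start=1):
        x = hard_threshold(W @ y, s)   # exact transform-model sparse coding
        Gamma += (np.outer(y, y) - Gamma) / t
        Theta += (np.outer(y, x) - Theta) / t
        # closed-form minimizer of the lam-regularized running objective
        L = np.linalg.cholesky(Gamma + lam * np.eye(n))
        Linv = np.linalg.inv(L)
        U, S, Vh = np.linalg.svd(Linv @ Theta)
        W = 0.5 * Vh.T @ np.diag(S + np.sqrt(S**2 + 2.0 * lam)) @ U.T @ Linv
    return W

# usage: learn a 16x16 transform from 2000 streamed random signals
rng = np.random.default_rng(0)
stream = [rng.standard_normal(16) for _ in range(2000)]
W = online_transform_learning(stream, s=3)
```

Note the memory footprint: two n × n statistics plus the transform itself, independent of how many signals have been seen, which is consistent with the "modest memory" claim. Each update as written pays an n × n SVD; mini-batch processing would amortize that cost across several signals.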