2016
DOI: 10.1109/tsp.2016.2540599

Trainlets: Dictionary Learning in High Dimensions

Abstract: Sparse representations have been shown to be a very powerful model for real-world signals, and have enabled the development of applications with notable performance. Combined with the ability to learn a dictionary from signal examples, sparsity-inspired algorithms often achieve state-of-the-art results in a wide variety of tasks. Yet, these methods have traditionally been restricted to small dimensions, mainly due to the computational constraints that the dictionary learning problem entails. In the context of im…
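The abstract's core pipeline — learn a dictionary from signal examples, then sparse-code new signals over it — can be sketched in a few lines. This is a minimal illustration using scikit-learn's generic `MiniBatchDictionaryLearning` on synthetic patches, not the paper's OSDL/Trainlets algorithm; all data and parameter choices here are arbitrary assumptions for demonstration.

```python
# Minimal dictionary-learning sketch (NOT the paper's OSDL/Trainlets method):
# learn 32 atoms from 200 synthetic 8x8 "patches", then sparse-code them
# with Orthogonal Matching Pursuit (OMP).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
patches = rng.standard_normal((200, 64))        # 200 signals of dimension 64
patches -= patches.mean(axis=1, keepdims=True)  # remove the DC component

dl = MiniBatchDictionaryLearning(
    n_components=32,               # number of dictionary atoms
    transform_algorithm="omp",     # sparse coding via OMP
    transform_n_nonzero_coefs=4,   # sparsity level per signal
    random_state=0,
)
codes = dl.fit(patches).transform(patches)

print(dl.components_.shape)  # (32, 64): 32 learned atoms of dimension 64
print(codes.shape)           # (200, 32): sparse codes, <= 4 nonzeros per row
```

The computational bottleneck the abstract alludes to is visible here: the cost of both learning and coding grows quickly with the signal dimension (64 in this toy), which is why classical patch-based methods stay small.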

Cited by 88 publications (74 citation statements)
References 51 publications
“…Here we extend the comparative experiments considering R-SVD (integrating OMP) versus the iterative alternating scheme method ILS-DLA, and the on-line dictionary learning method OSDL by Sulam et al [24]. …”
Section: Experimental Analysis
confidence: 96%
“…Then we apply the method to synthetic data, conducting extensive experiments on both R-SVD and K-SVD using OMP as the sparsifier. A further investigation is conducted on two different sparse decomposition methods, namely k-LiMapS and SL0, and alternative dictionary learning methods, namely ILS-DLA by Engan et al [20] and OSDL by Sulam et al [24]. …”
Section: Experimental Analysis
confidence: 99%
“…Thus the patch size used in dictionary learning should not be large, which, however, poses a limitation to the image size that we can handle. Nevertheless, efficient dictionary learning methods dealing with images of larger scales have recently been proposed [46][47][48], in which the handled image size can go beyond 64 × 64 pixels. To demonstrate the effectiveness of the proposed method for images of larger size, we carry out simulations over the LFWcrop database [49], which consists of more than 13,000 images of 64 × 64 pixels.…”
Section: Discussion
confidence: 99%
“…In addition, due to the computational complexity of the problem, the majority of the learning algorithms have been restricted to work with relatively small signals. Recent works [4] tried to address the latter issue by observing that learned dictionaries have a certain sparse structure, which can be efficiently utilised. These novel algorithms are capable of handling signals with dimension in the several thousands, and of learning from millions of data signals.…”
Section: Introduction
confidence: 99%
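The "sparse structure" this last citation refers to is the double-sparsity idea: model the learned dictionary as D = B·A, where B is a fixed fast base transform and A is a sparse matrix, so applying D costs one fast transform plus a sparse multiply instead of a dense matrix product. The sketch below illustrates that factorization with a DCT base; all names and sizes are illustrative assumptions, not the paper's API.

```python
# Hedged illustration of a "sparse dictionary" D = B @ A:
# B is a fixed orthonormal DCT base, A is a sparse atom-representation
# matrix, so D @ x can be applied without ever forming D densely.
import numpy as np
from scipy.fft import dct
from scipy.sparse import random as sparse_random

n = 64    # signal dimension
m = 128   # number of atoms (overcomplete dictionary)

B = dct(np.eye(n), norm="ortho", axis=0)   # fixed orthonormal DCT base
A = sparse_random(n, m, density=0.05, random_state=0, format="csc")
x = np.random.default_rng(0).standard_normal(m)  # a coefficient vector

y_fast = B @ (A @ x)      # apply D = B @ A implicitly: sparse multiply + base
D = B @ A.toarray()       # dense D, for verification only
y_dense = D @ x

print(np.allclose(y_fast, y_dense))  # True
```

Because A has few nonzeros per atom, both storage and the per-signal multiplication cost scale with the sparsity of A rather than with n·m, which is what makes dictionary learning feasible at the signal dimensions these citing works mention.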