Proceedings of the 26th Annual International Conference on Machine Learning 2009
DOI: 10.1145/1553374.1553463
Online dictionary learning for sparse coding

Abstract: Sparse coding, that is, modelling data vectors as sparse linear combinations of basis elements, is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on learning the basis set, also called the dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal reconstruction and classification in the audio and image processing domains. This paper proposes a new online optimization algorithm for dictionary learning, based on …


Cited by 1,828 publications (1,460 citation statements)
References 21 publications
“…The dictionary is learned by the algorithm in [62], and the sparse codes are learned using orthogonal matching pursuit (OMP) [62]. The parameter λ is set to 0.15.…”
Section: Sparse Coding
confidence: 99%
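The λ in this excerpt is the weight on the ℓ1 penalty in the paper's sparse-coding objective, min_a 0.5*||x - D a||^2 + λ*||a||_1. As a rough illustrative sketch (not the cited implementation, which uses OMP/LARS-style solvers), that coding step can be solved with ISTA, i.e. proximal gradient descent with soft-thresholding; the toy orthonormal dictionary below is an assumption for the demo:

```python
import numpy as np

def sparse_code_ista(D, x, lam=0.15, n_iter=200):
    """Minimize 0.5*||x - D a||^2 + lam*||a||_1 by ISTA
    (gradient step on the smooth part, then soft-thresholding)."""
    L = np.linalg.norm(D, 2) ** 2              # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = a - D.T @ (D @ a - x) / L          # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

# Toy check: for an orthonormal dictionary, the lasso solution is the
# soft-thresholding of the analysis coefficients D.T @ x.
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.normal(size=(6, 6)))   # orthonormal atoms (demo only)
x = 2.0 * D[:, 1] - 1.0 * D[:, 4]
a = sparse_code_ista(D, x, lam=0.15)
```

For an orthonormal D the iteration converges in one step, which makes the behaviour easy to verify; with a general overcomplete dictionary more iterations are needed.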
“…Here α(x, D) is the sparse representation for the pair (x, D), and can be computed quickly using orthogonal matching pursuit (OMP) [49], for which we can leverage already existing efficient implementations [50,51]. Note that OMP is a greedy algorithm and we resort to it for reasons of efficiency rather than correctness or exact recovery.…”
Section: Formulation
confidence: 99%
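The greedy structure of OMP described in this excerpt (repeatedly pick the atom most correlated with the residual, then re-fit the coefficients by least squares on the selected support) can be sketched in a few lines of NumPy. This is a toy illustration, not the optimized implementations referenced in [50,51]:

```python
import numpy as np

def omp(D, x, n_nonzero):
    """Greedy orthogonal matching pursuit: at each step, select the atom
    most correlated with the residual, then least-squares re-fit."""
    residual = x.copy()
    support = []
    alpha = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))  # best-matching atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        alpha[:] = 0.0
        alpha[support] = coef
        residual = x - D[:, support] @ coef
    return alpha

# Toy check: with an orthonormal dictionary, OMP recovers a 2-sparse code.
rng = np.random.default_rng(0)
D, _ = np.linalg.qr(rng.normal(size=(8, 8)))    # orthonormal atoms (demo only)
true = np.zeros(8)
true[[1, 5]] = [2.0, -1.0]
x = D @ true
alpha = omp(D, x, n_nonzero=2)
```

As the excerpt notes, greedy selection offers no exact-recovery guarantee in general; the orthonormal toy case is chosen precisely so the recovery is easy to check.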
“…A recent approach [26] has been proposed to iteratively build a dictionary by processing one element (or a small subset) of the training set at a time. As mentioned by the authors: such online approach "is particularly important in the context of image and video processing [35], where it is common to learn dictionaries adapted to small patches, with training data that may include several millions of these patches (roughly one per pixel and per frame)".…”
Section: Iterative Dictionary Building
confidence: 99%
“…However, the approach of Mairal et al. [26] suffers from excessive computational cost with respect to active-learning requirements. Indeed, during the dictionary update steps, the new dictionary D t is computed by minimizing a surrogate of the empirical cost function (under some constraints) using both the new annotated data x t and its decomposition α t over the dictionary D t−1 obtained at the previous iteration.…”
Section: Iterative Dictionary Building
confidence: 99%
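The dictionary-update step this excerpt describes, minimizing a surrogate of the empirical cost over D with unit-norm columns, is typically carried out by block coordinate descent on the columns of D using the accumulated statistics A = Σ α_t α_tᵀ and B = Σ x_t α_tᵀ. A minimal sketch under those assumptions, with illustrative variable names and no claim to match the cited code:

```python
import numpy as np

def update_dictionary(D, A, B, n_iter=10):
    """Block coordinate descent on the columns of D for the surrogate
    0.5*Tr(D^T D A) - Tr(D^T B), projecting each column onto the
    unit Euclidean ball (||d_j|| <= 1)."""
    for _ in range(n_iter):
        for j in range(D.shape[1]):
            if A[j, j] < 1e-12:
                continue                                  # atom never used yet
            u = D[:, j] + (B[:, j] - D @ A[:, j]) / A[j, j]
            D[:, j] = u / max(np.linalg.norm(u), 1.0)     # project to unit ball
    return D

# Toy check with a single atom: after one sample x_t with code alpha_t = 1,
# the updated atom should align with x_t.
x_t = np.array([0.6, 0.8])          # unit-norm sample
A = np.array([[1.0]])               # running sum of alpha alpha^T
B = x_t.reshape(2, 1)               # running sum of x alpha^T
D_new = update_dictionary(np.array([[1.0], [0.0]]), A, B, n_iter=1)
```

Because A and B are running sums over all past samples, each update reuses the whole history at the cost of one pass over the k columns, which is the source of the per-step cost the excerpt criticizes in an active-learning setting.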