2016
DOI: 10.1016/j.sigpro.2015.12.008
ℓ1-K-SVD: A robust dictionary learning algorithm with simultaneous update

Cited by 35 publications (18 citation statements)
References 37 publications
“…This property is achieved by our multidimensional representation, as well as the orthogonality of the dictionaries. Finally, compared to the widely used 1D dictionary training algorithms [162,165,172,175,192], the AMDE has orders of magnitude smaller memory footprint.…”
Section: Summary and Future Work
confidence: 99%
“…This stage is analogous to the sparse coding stage in the dictionary learning and TL algorithms [4][5][6][7][8][12][13][14][15][16][17]. In this stage, the coefficients of representation and hence the probability distribution of the representation of the signals are identified by iteratively minimising the entropy of representation.…”
Section: Stage I
confidence: 99%
“…This paper discusses methods that generate a sparsifying transform for a class of signals from the signals itself. The proposals in this paper are motivated by the work reported in [2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21]. This section describes the proposals in [2][3][4][5][6][7][8][9][10][11][12][13][14][15][16][17][18][19][20][21] briefly.…”
Section: Introduction
confidence: 99%
“…With the rise of big data and the rapid development of artificial intelligence, compressive sensing (CS) has been widely used and has obtained state-of-the-art results in multiple fields such as machine learning, neuroscience, signal processing, image and audio processing, classification, and statistics [23]–[25], owing to its flexibility, sparsity, and super-resolution capability. CS mainly consists of two parts: sparse coding and dictionary design.…”
Section: Introduction
confidence: 99%
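The last statement above splits CS-style dictionary learning into sparse coding and dictionary design, the same two alternating stages the cited paper's family of algorithms (K-SVD and its robust ℓ1 variant) is built on. As a minimal illustration only — not the paper's ℓ1-K-SVD update — here is a sketch of that alternation, using orthogonal matching pursuit for the coding stage and a MOD-style least-squares dictionary update; all names and the toy sizes are illustrative choices, not taken from the paper:

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding: pick up to k atoms of D that best explain y."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Least-squares fit of y on the atoms chosen so far.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

def mod_update(Y, X):
    """MOD-style dictionary update: D = Y X^+, atoms renormalised."""
    D = Y @ np.linalg.pinv(X)
    norms = np.linalg.norm(D, axis=0, keepdims=True)
    norms[norms < 1e-12] = 1.0  # guard against unused atoms
    return D / norms

# Toy run: 20-dim signals, 30 atoms, sparsity 3.
rng = np.random.default_rng(0)
Y = rng.standard_normal((20, 50))
D = rng.standard_normal((20, 30))
D /= np.linalg.norm(D, axis=0, keepdims=True)
for _ in range(5):
    # Stage 1: sparse coding of every signal against the current dictionary.
    X = np.column_stack([omp(D, Y[:, i], 3) for i in range(Y.shape[1])])
    # Stage 2: dictionary update given the fixed codes.
    D = mod_update(Y, X)
```

The ℓ2 least-squares steps here are exactly what ℓ1-based variants such as the cited paper replace to gain robustness to outliers; K-SVD differs from the MOD update shown by revising atoms one at a time via rank-one SVDs.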