2017 IEEE International Conference on Image Processing (ICIP)
DOI: 10.1109/icip.2017.8296573
Online convolutional dictionary learning

Cited by 23 publications (37 citation statements) · References 14 publications
“…Another aspect of this advantage is our ability to run in an online manner, even for a single input image. This stands in sharp contrast to other recent online methods [30,31] which allow for online training but only in the case of streaming images. Other approaches took a step further and proposed partitioning the image into smaller sub-images [32], but this is still far from [32] and the SBDL algorithm [1].…”
Section: Relation To Other Methodsmentioning
confidence: 66%
“…The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) [33], an accelerated proximal gradient method, has been used for CSC [6], [5], [19], and in a recent online CDL algorithm [18], but has not previously been considered for the dictionary update of a batch-mode dictionary learning algorithm.…”
Section: D / Frequency Domain Consensusmentioning
confidence: 99%
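The excerpt above refers to FISTA, the accelerated proximal gradient method of Beck and Teboulle used in several CSC and CDL solvers. A minimal sketch of the generic algorithm follows; it is an illustration of the method itself, not code from any of the cited implementations, and the lasso problem at the end (with the hypothetical names `grad`, `prox`, `x_hat`) is only a toy instance of the sparse-coding objective form.

```python
import numpy as np

def fista(grad, prox, L, x0, n_iter=200):
    # Minimal FISTA sketch (accelerated proximal gradient): minimizes
    # f(x) + g(x) given the gradient of the smooth part f, the proximal
    # operator of g, and a Lipschitz constant L of grad f.
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    for _ in range(n_iter):
        x_new = prox(y - grad(y) / L, 1.0 / L)         # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x

# Toy example: l1-regularized least squares (the objective form arising in
# sparse coding), min_x 0.5*||A x - b||^2 + lam*||x||_1.  With A = I the
# minimizer is the soft-thresholding of b by lam.
A = np.eye(2)
b = np.array([3.0, -2.0])
lam = 1.0
grad = lambda x: A.T @ (A @ x - b)
prox = lambda v, step: np.sign(v) * np.maximum(np.abs(v) - lam * step, 0.0)
x_hat = fista(grad, prox, L=1.0, x0=np.zeros(2))
# x_hat -> [2., -1.] (soft-threshold of b by lam = 1)
```

In CSC solvers the same scheme is applied with the smooth part being the data-fidelity term (often evaluated in the frequency domain) and the prox being soft-thresholding of the coefficient maps.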
“…All the results using the methods discussed and analyzed in the main document were computed using the Python implementation of the SPORCO library [36]. We also include comparisons with the method proposed by Papyan et al. [27], using their publicly available Matlab and C implementation. We tried to include the publicly available Matlab implementations of the methods proposed by Šorel and Šroubek [24] and by Heide et al. [9] in these comparisons, but were unable to obtain acceptable results. We therefore omit these methods from the comparisons here, including them only in a separate set of experiments on a smaller data set, reported in Sec.…”
Section: Sv Large Training Set Experimentsmentioning
confidence: 99%
“…Thus, Theorem 1 also holds for Algorithm 3, and a stationary point of problem (15) can be obtained. E. Discussion of [27], [28]. Here, we discuss the very recent works [27], [28], which also consider online learning of the dictionary in CSC. Extra experiments are performed in Section B, which show that our method is much faster than them.…”
Section: Convergencementioning
confidence: 99%
“…However, though their O(·) complexities are the same, storing a sparse matrix requires 2-3 times more space than storing a dense matrix with the same number of nonzero entries, as each nonzero entry in a sparse matrix needs to be kept in the compressed sparse row (CSR) format [30]. Moreover, though [28] and the proposed method both take O(NK²P) time to update the history matrices, using sparse matrices as in [28] is empirically slower [30]. Preliminary experiments show that with K = 100 filters, the proposed algorithm is 5 times faster on a P = 100 × 100 image and 10 times faster on a P = 200 × 200 image.…”
Section: Convergencementioning
confidence: 99%
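The storage argument in the excerpt above can be illustrated with a small sketch: CSR keeps three arrays per matrix (values, column indices, row pointers), so each stored nonzero costs a value plus an index, whereas a dense array holding the same number of entries costs only the values. The numbers below are illustrative only (the exact ratio depends on the index dtype), not measurements from the cited papers.

```python
import numpy as np
from scipy import sparse

# Build a random sparse matrix in CSR form and compare its memory
# footprint, per stored nonzero, against a dense array holding the
# same number of values.
n = 1000
nnz_target = 10_000
rng = np.random.default_rng(0)
rows = rng.integers(0, n, size=nnz_target)
cols = rng.integers(0, n, size=nnz_target)
vals = rng.random(nnz_target)
M = sparse.csr_matrix((vals, (rows, cols)), shape=(n, n))

# CSR stores: data (one float per nonzero), indices (one column index
# per nonzero), and indptr (one row pointer per row + 1).
csr_bytes = M.data.nbytes + M.indices.nbytes + M.indptr.nbytes
dense_equiv_bytes = M.nnz * M.data.itemsize  # same nonzeros stored densely
overhead = csr_bytes / dense_equiv_bytes
# overhead > 1: each CSR nonzero costs extra index bytes on top of its value
```

With 8-byte floats and 4-byte indices this gives roughly a 1.5x per-nonzero overhead; with 8-byte indices it approaches 2x, consistent in spirit with the 2-3x figure quoted above once bookkeeping structures are included.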