2019
DOI: 10.1109/tip.2019.2922074
An $\alpha$-Divergence-Based Approach for Robust Dictionary Learning

Abstract: In this paper, a robust sequential dictionary learning (DL) algorithm is presented. The proposed algorithm is motivated by the maximum likelihood perspective on dictionary learning and its link to the minimization of the Kullback-Leibler divergence. It is obtained by using a robust loss function in the data fidelity term of the DL objective instead of the usual quadratic loss. The proposed robust loss function is derived from the α-divergence as an alternative to the Kullback-Leibler divergence, which leads t…
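As context for the abstract, here is a minimal sketch of such an α-divergence-based data-fidelity term, assuming the common Amari-type parameterization over nonnegative entries (the exact loss, weighting, and regularization used in the paper may differ):

\[
D_\alpha(\mathbf{Y}\,\|\,\mathbf{D}\mathbf{X}) = \frac{1}{\alpha(\alpha-1)} \sum_{i,j} \left( y_{ij}^{\alpha}\,[\mathbf{D}\mathbf{X}]_{ij}^{1-\alpha} - \alpha\, y_{ij} + (\alpha-1)\,[\mathbf{D}\mathbf{X}]_{ij} \right), \qquad \alpha \notin \{0,1\},
\]

which recovers the (generalized) Kullback-Leibler divergence in the limit \(\alpha \to 1\). The robust DL objective then replaces the usual quadratic fidelity \(\tfrac{1}{2}\|\mathbf{Y}-\mathbf{D}\mathbf{X}\|_F^2\) with \(D_\alpha(\mathbf{Y}\,\|\,\mathbf{D}\mathbf{X})\), plus a sparsity penalty on the coefficients \(\mathbf{X}\).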

Cited by 28 publications (9 citation statements). References 40 publications.
“…In image and video processing, where it is common to learn dictionaries adapted to small patches, with training data that may include several millions of these patches, Mairal et al. [19] proposed an online dictionary learning (ODL) algorithm based on stochastic approximations, which scales to large datasets with millions of training samples and also handles dynamic training data that changes over time, such as video sequences. In the same context of image processing, Iqbal et al. [23] proposed a DL algorithm that weakens the assumptions on the noise by using a function derived from the α-divergence in the data fidelity term of the objective function instead of the quadratic loss or the Frobenius norm. The algorithm is applied to various image processing applications, such as digit recognition, background removal, and grayscale image denoising.…”
Section: Related Work (mentioning, confidence: 99%)
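To make the excerpt above concrete, here is a minimal, hedged sketch of online (mini-batch) dictionary learning on image-like data using scikit-learn's MiniBatchDictionaryLearning, which follows the stochastic-approximation scheme of Mairal et al.; the dataset and hyperparameters are illustrative, and the data-fidelity term is the standard quadratic loss rather than the α-divergence loss of Iqbal et al. [23]:

# Minimal sketch: online (mini-batch) dictionary learning in the spirit of
# Mairal et al.'s ODL, using scikit-learn. The digits dataset and all
# hyperparameters below are illustrative choices, not taken from the cited papers.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import MiniBatchDictionaryLearning

# Treat each 8x8 digit image as one training "patch" (a 64-dimensional vector).
X = load_digits().data.astype(np.float64)
X -= X.mean(axis=0)                          # center the patches

odl = MiniBatchDictionaryLearning(
    n_components=50,      # number of dictionary atoms
    alpha=1.0,            # l1 sparsity weight (not the divergence parameter)
    batch_size=64,        # mini-batch size of the stochastic updates
    random_state=0,
)
codes = odl.fit_transform(X)                 # sparse codes, shape (n_samples, 50)
dictionary = odl.components_                 # learned atoms, shape (50, 64)
print(codes.shape, dictionary.shape)

Note that the alpha argument above is scikit-learn's sparsity weight, unrelated to the divergence parameter α; the robust α-divergence fidelity proposed by Iqbal et al. would replace the quadratic data term inside the update steps and is not available in scikit-learn.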
“…The α-divergences are widely used in information sciences, see [3,8,38,22,17,1] just to cite a few applications. The singly-parametric α-divergences have also been generalized to bi-parametric families of divergences like the (α, β)-divergences [2] or the αβ-divergences [37].…”
Section: Introduction, 1. Statistical Divergences (mentioning, confidence: 99%)
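As a small, hedged illustration of the singly-parametric family mentioned above, the following NumPy sketch evaluates an Amari-type α-divergence between nonnegative arrays; conventions differ across the cited references, so this is one common parameterization rather than the one used in any specific paper:

import numpy as np

def alpha_divergence(p, q, alpha, eps=1e-12):
    """Amari-type alpha-divergence between nonnegative arrays p and q.

    The limit alpha -> 1 gives the generalized KL divergence KL(p || q),
    and alpha -> 0 gives the reverse divergence KL(q || p).
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    if np.isclose(alpha, 1.0):
        return np.sum(p * np.log(p / q) - p + q)
    if np.isclose(alpha, 0.0):
        return np.sum(q * np.log(q / p) - q + p)
    return np.sum(p**alpha * q**(1.0 - alpha) - alpha * p + (alpha - 1.0) * q) / (alpha * (alpha - 1.0))

# The divergence is nonnegative and vanishes when the two arguments coincide.
p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])
print(alpha_divergence(p, q, alpha=0.5))     # > 0
print(alpha_divergence(p, p, alpha=0.5))     # ~ 0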
“…Concerning the fMRI framework, besides K-SVD, the most commonly used DL algorithm for analyzing fMRI data is the Online Dictionary Learning (ODL) algorithm [96]. Despite its relative simplicity, ODL constitutes a very efficient algorithm that has been successfully applied to reconstruct current brain networks from both task-related and rs-fMRI [1,82,127,75,104]. Furthermore, ODL often acts as a reference baseline for the development of novel advanced DL algorithms [155].…”
Section: Conventional Algorithms and Alternatives (mentioning, confidence: 99%)
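As a hedged sketch of how online dictionary learning is typically applied to rs-fMRI in practice, the snippet below uses nilearn's DictLearning decomposition, which builds on an online dictionary learning solver; the dataset, number of components, and smoothing are illustrative assumptions, not settings from the cited studies:

# Minimal sketch: extracting spatial brain networks from resting-state fMRI
# with nilearn's dictionary learning decomposition. All settings are illustrative.
from nilearn.datasets import fetch_adhd
from nilearn.decomposition import DictLearning

data = fetch_adhd(n_subjects=2)                  # small public rs-fMRI sample
dict_learning = DictLearning(
    n_components=20,        # number of spatial networks to extract
    smoothing_fwhm=6.0,     # spatial smoothing (mm) applied before decomposition
    random_state=0,
)
dict_learning.fit(data.func)                     # fit on the 4D functional images
networks_img = dict_learning.components_img_     # one spatial map per network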
“…Therefore, it is not surprising to find that, over the last few years, many different DL algorithms and methods oriented toward analyzing fMRI data have been developed. For example, there exist DL methods particularly tailored to fMRI data, such as [82], as well as more recent and sophisticated approaches adapted to the noise naturally present in fMRI data, such as [75].…”
Section: Conventional Algorithms and Alternatives (mentioning, confidence: 99%)