Published: 2006
DOI: 10.1007/11679363_5

Csiszár’s Divergences for Non-negative Matrix Factorization: Family of New Algorithms

Abstract: In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches which allow us to obtain generalized forms of multiplicative NMF algorithms and unify some existing algorithms. We also give a flexible and relaxed form of the NMF algorithms to increase convergence speed and impose desired constraints such as sparsity and smoothn…

Cited by 195 publications (158 citation statements).
References 8 publications.
“…In addition to the use of different objective functions such as the least squares [4] and Kullback Leibler [13], the main difference among various algorithms lies in the update rule. The update rule directly influences the convergence speed and the quality of the factorization.…”
Section: Related Work (mentioning)
confidence: 99%
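The statement above contrasts least-squares and Kullback-Leibler objectives and points to the update rule as the main practical difference between NMF algorithms. As a concrete illustration, here is a minimal sketch of the classical multiplicative updates for the squared-Euclidean cost; the function name, parameters, and defaults are illustrative choices of ours, not taken from the cited papers.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Minimal sketch of multiplicative NMF updates for the squared
    Euclidean cost ||V - W H||_F^2 (illustrative names and defaults)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T V) / (W^T W H); eps guards against division by zero
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W <- W * (V H^T) / (W H H^T)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

A toy call such as `W, H = nmf_multiplicative(np.abs(np.random.rand(100, 40)), rank=5)` yields a nonnegative factorization V ≈ W @ H; choosing a different divergence changes only the two update lines, which is the point the citing authors make.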
“…Additionally, the basis can be constrained to be sparse which typically leads to an even more meaningful decomposition of the data. As a result, many researchers focused on sparse non-negative matrix factorization (SNMF) [13,14,4,9] in the past few years.…”
Section: Introduction (mentioning)
confidence: 99%
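Since this excerpt highlights sparse NMF (SNMF), the sketch below shows one common way sparsity is imposed: an L1 penalty on the coefficient matrix H, which simply adds the penalty weight to the denominator of the multiplicative update. This is an illustrative formulation under our own naming, not necessarily the exact variant used in the cited SNMF papers.

```python
import numpy as np

def sparse_nmf(V, rank, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Illustrative sparse-NMF sketch: an L1 penalty lam * sum(H) on the
    coefficients gives the update H <- H * (W^T V) / (W^T W H + lam)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + lam + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        # Normalize columns of W so the penalty cannot be bypassed by
        # rescaling W up and H down.
        W /= np.maximum(W.sum(axis=0, keepdims=True), eps)
    return W, H
```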
“…Other BSS approaches that can deal with statistically dependent sources include: independent subspace analysis (ISA) [24][25], nonnegative matrix and tensor factorization (NMF/NTF) [27][28][29][30], and the blind Richardson-Lucy (BRL) algorithm [33][34][35][36], which are used for comparison purpose in this paper. They are briefly described as follows.…”
Section: Algorithms For Comparison (mentioning)
confidence: 99%
“…NMF/NTF algorithms may yield physically useful solutions by imposing the nonnegativity, sparseness or smoothness constraints on the sources [27][28][29][30]. In [27], the NMF algorithm was first derived to minimize two cost functions: the squared Euclidean distance and the Kullback-Leibler divergence.…”
Section: Nonnegative Matrix and Tensor Factorization (mentioning)
confidence: 99%
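For readers who want the two cost functions referenced in [27] spelled out, below are the squared Euclidean distance and the (generalized) Kullback-Leibler divergence for V ≈ WH, together with the standard multiplicative updates that decrease the KL cost; the notation is ours, the formulas are the classical ones.

```latex
% Squared Euclidean distance and generalized Kullback-Leibler divergence
% for the nonnegative factorization V \approx WH.
\[
  D_{E}(\mathbf{V}\,\|\,\mathbf{WH}) = \sum_{ij}\bigl(V_{ij} - (\mathbf{WH})_{ij}\bigr)^{2},
  \qquad
  D_{KL}(\mathbf{V}\,\|\,\mathbf{WH}) = \sum_{ij}\Bigl(V_{ij}\,\ln\frac{V_{ij}}{(\mathbf{WH})_{ij}} - V_{ij} + (\mathbf{WH})_{ij}\Bigr).
\]
% Standard multiplicative updates for the KL cost; nonnegativity of
% V, W, and H keeps every factor nonnegative throughout.
\[
  H_{aj} \leftarrow H_{aj}\,
    \frac{\sum_{i} W_{ia}\, V_{ij}/(\mathbf{WH})_{ij}}{\sum_{k} W_{ka}},
  \qquad
  W_{ia} \leftarrow W_{ia}\,
    \frac{\sum_{j} H_{aj}\, V_{ij}/(\mathbf{WH})_{ij}}{\sum_{k} H_{ak}}.
\]
```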