2014
DOI: 10.48550/arxiv.1412.1788
Preprint
Primal-Dual Algorithms for Non-negative Matrix Factorization with the Kullback-Leibler Divergence

Cited by 1 publication (4 citation statements)
References 0 publications
“…(11). The second model is NMF using KL divergence, labeled NMF [16]. The last model, GNMF [10], described in Sec.…”
Section: Results (mentioning)
confidence: 99%
“…where • is the pointwise multiplication operator and θ_A, θ_B ∈ ℝ_+. We use a weighted Kullback-Leibler (KL) divergence as a distance measure between C and AB, which has been shown to be more accurate than the Frobenius norm in various NMF settings [16]. The second term is the TV of the rows of A on the playlists graph, so penalizing it promotes piecewise-constant signals [11].…”
Section: Our Recommendation Algorithm (mentioning)
confidence: 99%
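The objective these citing papers refer to, KL-divergence NMF of a matrix C into factors A and B, can be sketched as follows. This is a minimal illustration using the plain (unweighted) generalized KL divergence with the standard multiplicative updates, not the primal-dual algorithm of the cited paper or the weighted variant of the citing work; all function and variable names here are illustrative assumptions.

```python
import numpy as np

def kl_divergence(C, AB, eps=1e-12):
    """Generalized KL divergence D(C || AB), summed over all entries.

    This is the usual NMF objective under the KL loss; eps guards the log
    against zero entries.
    """
    return np.sum(C * np.log((C + eps) / (AB + eps)) - C + AB)

def nmf_kl(C, rank, n_iter=200, seed=0):
    """Factor C ≈ A @ B with A, B >= 0 by multiplicative updates
    (Lee-Seung style) that monotonically decrease kl_divergence(C, A @ B)."""
    rng = np.random.default_rng(seed)
    m, n = C.shape
    A = rng.random((m, rank)) + 0.1   # positive initialization
    B = rng.random((rank, n)) + 0.1
    for _ in range(n_iter):
        AB = A @ B
        # A_ik <- A_ik * [ (C/AB) B^T ]_ik / sum_j B_kj
        A *= (C / AB) @ B.T / B.sum(axis=1)
        AB = A @ B
        # B_kj <- B_kj * [ A^T (C/AB) ]_kj / sum_i A_ik
        B *= A.T @ (C / AB) / A.sum(axis=0)[:, None]
    return A, B
```

The weighted variant mentioned in the quote would scale each entry's contribution to the divergence; the update rules change accordingly, but the overall alternating multiplicative structure is the same.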