2010
DOI: 10.1109/tpami.2008.277

Convex and Semi-Nonnegative Matrix Factorizations

Abstract: We present several new variations on the theme of nonnegative matrix factorization (NMF). Considering factorizations of the form X = FG^T, we focus on algorithms in which G is restricted to contain nonnegative entries, but allow the data matrix X to have mixed signs, thus extending the applicable range of NMF methods. We also consider algorithms in which the basis vectors of F are constrained to be convex combinations of the data points. This is used for a kernel extension of NMF. We provide algorithms for c…
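The semi-NMF setting described in the abstract (mixed-sign X and F, nonnegative G) can be sketched with the multiplicative updates widely attributed to this line of work: F gets a closed-form least-squares solve given G, and G gets a multiplicative update built from the positive/negative parts of the relevant Gram matrices. This is an illustrative sketch and may differ in detail from the paper's exact algorithm.

```python
import numpy as np

def semi_nmf(X, k, n_iter=200, seed=0, eps=1e-12):
    """Semi-NMF sketch: approximate X ~ F @ G.T with G >= 0,
    while X and F may contain mixed-sign entries."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = np.abs(rng.standard_normal((m, k)))   # nonnegative coefficients
    F = np.zeros((n, k))
    for _ in range(n_iter):
        # Closed-form least-squares F given G (pinv guards against singularity)
        F = X @ G @ np.linalg.pinv(G.T @ G)
        A = X.T @ F                            # mixed-sign
        B = F.T @ F                            # mixed-sign off-diagonal
        # Split into elementwise positive and negative parts
        Ap, An = (np.abs(A) + A) / 2, (np.abs(A) - A) / 2
        Bp, Bn = (np.abs(B) + B) / 2, (np.abs(B) - B) / 2
        # Multiplicative update keeps G entrywise nonnegative
        G *= np.sqrt((Ap + G @ Bn) / (An + G @ Bp + eps))
    return F, G

# Mixed-sign data: semi-NMF still applies, unlike standard NMF
X = np.random.default_rng(1).standard_normal((20, 30))
F, G = semi_nmf(X, k=5)
rel_err = np.linalg.norm(X - F @ G.T) / np.linalg.norm(X)
```

Because F is re-solved by least squares at every step, the residual can never exceed the trivial F = 0 solution, so the relative error stays below 1 on nondegenerate data.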

Cited by 1,121 publications (827 citation statements)
References 25 publications
“…The results are highly competitive with other methods. For Euclidean distance, although our algorithm's sparsity is only less than cNMF [18] (Figure 2), it has lower information loss and higher performance in classification. In addition, especially for KL-divergence, our approach retains the best sparse solutions (Figure 3), while it still has the best result for the other measures.…”
Section: A. Interpretation (mentioning)
confidence: 95%
“…More particularly, our algorithm for Euclidean distance is compared to NMF [3], spNMF [17], oNMF [8], and cNMF [18], while the other one with KL-divergence is compared to kl-NMF [3], local non-negative matrix factorization (locNMF) [5], convolutional NMF(conNMF) [19], and Nonsmooth Nonnegative Matrix Factorization (nsNMF) [7]. The implemented codes are at http://www.ee.columbia.edu/ grindlay/code.html.…”
Section: Methods (mentioning)
confidence: 99%
“…[23] Convex non-negative matrix factorization (convex-NMF) [24] is a variant of NMF that imposes a restriction over the source matrix S to be a convex combination of the input data vectors. This restriction significantly improves the quality of data representation of S. Unlike standard NMF, convex-NMF applies to both non-negative and mixed-sign data matrices.…”
Section: Non-negative Matrix Factorization (NMF) (mentioning)
confidence: 99%
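The convexity restriction described in this citation statement, with each basis vector forced to be a combination of data columns (F = XW, W ≥ 0), can be sketched as follows. The multiplicative updates below follow the commonly cited form, which needs only the Gram matrix X^T X; this is a hedged sketch and may differ from the published algorithm.

```python
import numpy as np

def convex_nmf(X, k, n_iter=300, seed=0, eps=1e-12):
    """Convex-NMF sketch: X ~ (X @ W) @ G.T with W, G >= 0, so every
    basis vector X @ W[:, j] is a nonnegative combination of data columns."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    W = np.abs(rng.standard_normal((m, k)))
    G = np.abs(rng.standard_normal((m, k)))
    A = X.T @ X                                # only the Gram matrix is needed
    Ap, An = (np.abs(A) + A) / 2, (np.abs(A) - A) / 2
    for _ in range(n_iter):
        # Multiplicative updates preserve nonnegativity of G and W
        G *= np.sqrt((Ap @ W + G @ (W.T @ An @ W)) /
                     (An @ W + G @ (W.T @ Ap @ W) + eps))
        W *= np.sqrt((Ap @ G + An @ W @ (G.T @ G)) /
                     (An @ G + Ap @ W @ (G.T @ G) + eps))
    return W, G

X = np.random.default_rng(2).standard_normal((15, 40))   # mixed-sign input
W, G = convex_nmf(X, k=4)
basis = X @ W            # each basis vector lies in the cone of the data
rel_err = np.linalg.norm(X - basis @ G.T) / np.linalg.norm(X)
```

Because only X^T X enters the updates, the same scheme extends to kernel matrices, which is the kernel extension the abstract alludes to.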
“…Since the focus of this paper is validating the advantages of the deep dictionary learning framework, for simplicity, the dictionaries are learned by the convex semi-nonnegative matrix factorization [6], where the decomposition coefficient C_i contains only non-negative elements. Noting the role of C_i in representing the probability (abundance) that each voxel belongs to certain material types (end-members), this constraint is reasonable.…”
Section: Hierarchical Dictionary Learning (mentioning)
confidence: 99%