CVPR 2011
DOI: 10.1109/cvpr.2011.5995492

Non-negative matrix factorization as a feature selection tool for maximum margin classifiers

Abstract: Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition tool for multivariate data. Non-negative bases allow strictly additive combinations, which have been shown to be part-based as well as relatively sparse. We pursue a discriminative decomposition by coupling the NMF objective with a maximum margin classifier, specifically a support vector machine (SVM). Conversely, we propose an NMF-based regularizer for the SVM. We formulate the joint update equations and propose a new method…
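
As a rough illustration of the idea in the abstract (not the paper's joint update scheme), the sketch below runs a two-stage NMF-then-SVM pipeline with scikit-learn: non-negative activations from the factorization serve as features for a maximum-margin classifier. The data, component count, and C value are placeholders chosen only for the example.

```python
# Minimal sketch, assuming a sequential NMF -> SVM pipeline (the paper itself
# optimizes the two objectives jointly; this only shows how NMF activations
# can feed a max-margin classifier).
import numpy as np
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.random((200, 50))          # non-negative data matrix (samples x features)
y = rng.integers(0, 2, size=200)   # placeholder binary labels

# Decompose X ~ W H with non-negative factors; rows of W are per-sample activations.
nmf = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
W = nmf.fit_transform(X)

# Train a linear maximum-margin classifier on the NMF activations.
svm = LinearSVC(C=1.0).fit(W, y)
print("training accuracy:", svm.score(W, y))
```

In the joint formulation described in the abstract, the factorization would instead be regularized by the SVM loss so that the learned basis is discriminative rather than purely reconstructive; the pipeline above omits that coupling.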

Cited by 33 publications (15 citation statements)
References 20 publications
“…1: Generate cost function a(r) for binary-class, multi-label, multi-class tasks according to Eqs. (6), (7) and (8), respectively. 2: Calculate cost vector c_i ∈ R^n (1 ≤ i ≤ m) for the i-th class according to per-class FN_i/FP_i cost in a(r).…”
Section: Convergence Analysis
confidence: 99%
“…Since our algorithms combine feature selection, multi-kernel learning, graph regularization and NMF, we compared our algorithms with the following relevant methods: the original NMF (Lee & Seung, 2000), the graph-regularized NMF (GNMF) (Cai et al, 2011), the kernel NMF (NMF_K) (Lee, Cichocki, & Choi, 2009), the NMF with multi-kernels (NMF_MK) and the NMF with feature selection (NMF_FS) (Das Gupta & Xiao, 2011). In total seven different methods were compared on this data set.…”
Section: Results
confidence: 99%
“…However, simultaneously achieving feature learning and learning to rank has not been reported in the literature. The joint optimization principle is introduced in [9], [36]. Gupta et al [9] pursue a discriminative decomposition by coupling NMF objective with a maximum margin classifier.…”
Section: Related Work
confidence: 99%
“…The joint optimization principle is introduced in [9], [36]. Gupta et al [9] pursue a discriminative decomposition by coupling NMF objective with a maximum margin classifier. Fang et al [36] present a new classification framework using the multi-label correlation information to address the problem of simultaneously combining multiple feature views and maximum margin classification.…”
Section: Related Work
confidence: 99%