2012
DOI: 10.1109/tip.2012.2190081

A Convex Model for Nonnegative Matrix Factorization and Dimensionality Reduction on Physical Space

Abstract: A collaborative convex framework for factoring a data matrix X into a nonnegative product AS, with a sparse coefficient matrix S, is proposed. We restrict the columns of the dictionary matrix A to coincide with certain columns of the data matrix X, thereby guaranteeing a physically meaningful dictionary and dimensionality reduction. We use l(1, ∞) regularization to select the dictionary from the data and show that this leads to an exact convex relaxation of l(0) in the case of distinct noise-free d…
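The abstract describes selecting dictionary columns directly from the data matrix X via an l(1, ∞) row-sparsity penalty. As a rough illustration of that idea (not the paper's exact algorithm), the sketch below solves min_{S ≥ 0} 0.5‖X − XS‖²_F + λ Σ_i ‖S[i,:]‖_∞ by proximal gradient descent; the function names, toy data, and the sequential nonnegativity-then-prox step are all hypothetical simplifications. Rows of S with large ∞-norm mark data columns selected as dictionary atoms.

```python
import numpy as np

def prox_linf(v, t):
    """Prox of t*||v||_inf: v minus its projection onto the l1-ball of radius t
    (Moreau decomposition; threshold found by the standard sort-and-cumsum rule)."""
    if np.abs(v).sum() <= t:
        return np.zeros_like(v)
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    idx = np.arange(1, len(u) + 1)
    rho = np.nonzero(u * idx > css - t)[0][-1]
    theta = (css[rho] - t) / (rho + 1.0)
    return np.sign(v) * np.minimum(np.abs(v), theta)

def self_dictionary_factor(X, lam=0.05, iters=300):
    """Proximal-gradient sketch for min_{S>=0} 0.5||X - XS||_F^2 + lam*sum_i ||S[i,:]||_inf.
    Nonnegative projection followed by the row-wise l-inf prox is a heuristic
    composite step, not the paper's exact scheme."""
    n = X.shape[1]
    G = X.T @ X
    step = 1.0 / np.linalg.norm(G, 2)        # 1 / Lipschitz constant of the gradient
    S = np.zeros((n, n))
    for _ in range(iters):
        V = np.maximum(S - step * (G @ S - G), 0.0)   # gradient step + nonnegativity
        S = np.vstack([prox_linf(V[i], lam * step) for i in range(n)])
    return S

# Hypothetical toy data: two pure columns and one 50/50 mixture of them.
X = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
S = self_dictionary_factor(X, lam=0.02, iters=500)
```

With a small λ the reconstruction X ≈ XS becomes accurate, and the row-wise ∞-norm penalty encourages the coefficient mass to concentrate on the "pure" columns rather than on the mixture.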

Cited by 145 publications (169 citation statements). References 28 publications.
“…It is also related to NMF, to be described in the next section. In addition, there has been interest in using the measured data Y itself as the dictionary for MMV [62]. This self-dictionary MMV (SD-MMV) approach is related to pure pixel search.…”
Section: Further Discussion
confidence: 99%
“…However, the lack of physical interpretation of the compact dictionary (i.e., physical meaning of each basis in the dictionary) has been a critical shortcoming of the standard dictionary learning techniques. Inspired by the method in [26] based on non-negative matrix factorization, in this paper, we address this issue by proposing a novel probabilistic approach for selecting a group of features as a compact representation of all the features in the training set. We believe that nothing is more meaningful for representing the data than the data itself.…”
Section: Compact Representation of the Features
confidence: 99%
“…The convex optimization problem in Eq. 2 is solved using the Alternating Direction Method of Multipliers framework in [29] (more details about the optimization steps can be found in [26]). …”
Section: Compact Representation of the Features
confidence: 99%
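The excerpt above notes that the convex problem is solved within an Alternating Direction Method of Multipliers (ADMM) framework. As a minimal, generic illustration of that framework (the specific splitting, penalty ρ, and function names below are assumptions, not the cited paper's formulation), the sketch applies ADMM to a nonnegatively constrained self-expression problem via the splitting f(S) = 0.5‖X − XS‖²_F, g(Z) = indicator(Z ≥ 0), subject to S = Z:

```python
import numpy as np

def admm_nonneg_selfexpr(X, rho=1.0, iters=200):
    """Generic ADMM sketch for min_S 0.5||X - X S||_F^2  s.t. S >= 0.
    Splitting: f(S) quadratic term, g(Z) nonnegativity indicator, constraint S = Z."""
    n = X.shape[1]
    G = X.T @ X
    A = G + rho * np.eye(n)        # system matrix of the quadratic S-subproblem
    Z = np.zeros((n, n))
    U = np.zeros((n, n))           # scaled dual variable
    for _ in range(iters):
        # S-update: argmin_S f(S) + (rho/2)||S - Z + U||_F^2  (linear solve)
        S = np.linalg.solve(A, G + rho * (Z - U))
        # Z-update: projection onto the nonnegative orthant
        Z = np.maximum(S + U, 0.0)
        # dual update on the consensus constraint S = Z
        U = U + S - Z
    return Z

# Hypothetical toy data reused for a quick check.
X = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
S = admm_nonneg_selfexpr(X)
```

Each iteration alternates a cheap linear solve, an elementwise projection, and a dual ascent step, which is what makes the ADMM framework attractive for large convex factorization problems.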
“…We note also that in hyperspectral imaging, there is a common task called linear unmixing (or factorization into physical space) which consists in detecting the spectra of the pure materials and estimating their relative abundances. Linear unmixing can be seen as an equivalent problem to NMF (Esser et al, 2012).…”
Section: Maps P(E) → P(E); or On Grey-Level Images, i.e. Maps F(E, T
confidence: 99%