Hierarchical feature extraction by multi-layer non-negative matrix factorization network for classification task
2015 · DOI: 10.1016/j.neucom.2014.08.095

Cited by 42 publications (19 citation statements) · References 13 publications
“…The experiments for the face recognition rate with different values of r, based on the threshold SNMF [24], CNMF [21], and multi-layer NMF [22] methods with multiplicative iterative rules, on PCA [12], and on the SWNMF method with the new additive iteration rules proposed in this paper, are executed on the JAFFE database and ORL database, respectively. The results of the comparison experiments are shown in Figures 3-6.…”
Section: Comparison of SWNMF with Multiple Iteration NMF Methods and… (mentioning, confidence: 99%)
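A minimal sketch of this kind of rank-sweep comparison, using scikit-learn's plain NMF and PCA as stand-ins. The SWNMF, SNMF, and CNMF variants from the cited papers are not implemented here; note that scikit-learn's bundled Olivetti faces are the AT&T/ORL database, while JAFFE would have to be loaded separately.

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import NMF, PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# The Olivetti faces shipped with scikit-learn are the AT&T/ORL database:
# 400 images of 40 subjects, pixel values in [0, 1] (non-negative, as NMF requires).
faces = fetch_olivetti_faces()
X_train, X_test, y_train, y_test = train_test_split(
    faces.data, faces.target, test_size=0.3, stratify=faces.target, random_state=0)

def recognition_rate(extractor):
    """Fit a feature extractor, then classify with 1-NN in the feature space."""
    H_train = extractor.fit_transform(X_train)
    H_test = extractor.transform(X_test)
    return KNeighborsClassifier(n_neighbors=1).fit(H_train, y_train).score(H_test, y_test)

# Sweep the factorization rank r, as in the experiments described above.
for r in (10, 20, 40, 80):
    nmf_acc = recognition_rate(NMF(n_components=r, init="nndsvda", max_iter=400))
    pca_acc = recognition_rate(PCA(n_components=r))
    print(f"r={r:3d}  NMF acc={nmf_acc:.3f}  PCA acc={pca_acc:.3f}")
```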
“…Therefore, it has been widely used to extract features from images, voice, video, and other non-negative datasets [18,19]. In recent years, many improved NMF algorithms have been proposed, such as the SNMF (sparse non-negative matrix factorization) method [20], the CNMF (convex non-negative matrix factorization) method [21], and the multi-layer NMF method [22]. The SNMF method accounts for the redundant information hidden in complex data by adding sparsity constraints to the iteration rule.…”
Section: Introduction (mentioning, confidence: 99%)
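As a rough illustration of the sparsity idea in the quote above (not the exact update rules of [20]), here is a minimal sparse NMF sketch: an L1 penalty on the coefficient matrix H folded into the standard multiplicative update, so that minimizing ||X − WH||_F² + λ·sum(H) drives many coefficients toward zero.

```python
import numpy as np

def sparse_nmf(X, r, lam=0.5, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative updates for min ||X - WH||_F^2 + lam * sum(H):
    the L1 penalty lam appears additively in the denominator of the H update."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + lam + eps)  # sparsity-penalized update
        W *= (X @ H.T) / (W @ H @ H.T + eps)        # plain NMF update for W
    return W, H

# Synthetic non-negative data; larger lam yields sparser coefficients H.
X = np.abs(np.random.default_rng(1).normal(size=(100, 60)))
W, H = sparse_nmf(X, r=10, lam=0.5)
print("reconstruction error:", np.linalg.norm(X - W @ H))
print("fraction of near-zero coefficients:", np.mean(H < 1e-3))
```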
“…where the terms λ_l e_l^T e_l H and λ_l · (e_l^T e_l)^2 in (24) are removed if l = L, meaning that we just controlled the sparsity of H_L to obtain a high-level final representation. After pre-training each layer separately, we fine-tuned the weights of all layers as well as the final representation, using W_l and H_L as initial approximation points, to reduce the total reconstruction error of (21). In particular, for a specific l and a factor matrix W_l, let us consider the model in (15) where W:…”
Section: Convergence Analysis (mentioning, confidence: 99%)
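The two-stage procedure this quote describes, layer-wise pre-training followed by joint fine-tuning of all W_l and the final H_L against the total reconstruction error ||X − W_1⋯W_L H_L||, can be sketched as follows. This is an illustrative reconstruction with plain multiplicative updates: the λ_l sparsity terms of the quoted equations are omitted, and the function names are mine, not the authors'.

```python
import numpy as np
rng = np.random.default_rng(0)

def nmf(X, r, n_iter=200, eps=1e-9):
    """Plain multiplicative-update NMF, used here for layer-wise pre-training."""
    W, H = rng.random((X.shape[0], r)), rng.random((r, X.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

def chain(mats, size):
    """Left-to-right product of a list of matrices (identity if the list is empty)."""
    P = np.eye(size)
    for M in mats:
        P = P @ M
    return P

def multilayer_nmf(X, ranks, n_finetune=100, eps=1e-9):
    # Pre-training: X ~ W1 H1, then H1 ~ W2 H2, ..., H_{L-1} ~ WL HL.
    Ws, H = [], X
    for r in ranks:
        W, H = nmf(H, r)
        Ws.append(W)
    # Fine-tuning: jointly reduce the total error ||X - W1...WL HL||,
    # starting from the pre-trained W_l and H_L as initial approximation points.
    for _ in range(n_finetune):
        for l, W in enumerate(Ws):
            A = chain(Ws[:l], X.shape[0])         # product of the layers above l
            B = chain(Ws[l + 1:], ranks[l]) @ H   # layers below l times HL
            Ws[l] *= (A.T @ X @ B.T) / (A.T @ A @ W @ B @ B.T + eps)
        Weff = chain(Ws, X.shape[0])
        H *= (Weff.T @ X) / (Weff.T @ Weff @ H + eps)
    return Ws, H

X = np.abs(rng.normal(size=(200, 120)))
Ws, H = multilayer_nmf(X, ranks=[64, 32, 16])
print("total error:", np.linalg.norm(X - chain(Ws, X.shape[0]) @ H))
```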
“…Cichocki et al. propose a multilayer NMF algorithm with multi-start initializations, which is also a deep factorization of the coefficient matrix [30], [31]. Song et al. extended the nsNMF algorithm, which can learn sparse features, to an algorithm with a multilayered structure [32]. In 2017, Trigeorgis et al. extended the semi-NMF algorithm to a multilayer structure and proposed the deep semi-NMF (DSNMF) algorithm [33].…”
Section: Introduction (mentioning, confidence: 99%)