2020
DOI: 10.1007/s40314-020-1091-2

Orthogonal incremental non-negative matrix factorization algorithm and its application in image classification

Abstract: To improve the sparseness of the base matrix in incremental non-negative matrix factorization, in this paper we present a new method, the orthogonal incremental non-negative matrix factorization algorithm (OINMF), which combines an orthogonality constraint with incremental learning. OINMF adopts batch updates in the process of incremental learning, and its iterative formulae are obtained using the gradient on the Stiefel manifold. Experiments on image classification show that the proposed method achieves much …
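To make the idea concrete, the following is a minimal sketch of NMF with an orthogonality constraint on the basis matrix, using the classic multiplicative updates of Ding et al. for orthogonal NMF. It illustrates only the general technique the abstract names; it is not the paper's OINMF method, whose batch-incremental, Stiefel-manifold iterative formulae are not reproduced here.

```python
import numpy as np

def orthogonal_nmf(X, r, n_iter=300, eps=1e-9, seed=0):
    """Factor a non-negative matrix X (m x n) as X ~= W @ H with W^T W ~= I.

    Sketch only: uses Ding-style multiplicative updates with an
    orthogonality constraint on W, NOT the OINMF paper's Stiefel-manifold
    incremental updates. `r` is the (assumed) factorization rank.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))   # non-negative basis matrix
    H = rng.random((r, n))   # non-negative coefficient matrix
    for _ in range(n_iter):
        # Standard NMF multiplicative update for the coefficients H.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # Orthogonality-constrained update for the basis W:
        # the denominator W @ W.T @ X @ H.T (instead of W @ H @ H.T)
        # pushes W^T W toward the identity while keeping W non-negative.
        W *= (X @ H.T) / (W @ W.T @ X @ H.T + eps)
    return W, H

# Toy usage on random non-negative data.
X = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = orthogonal_nmf(X, r=4)
```

Because the updates are multiplicative, non-negativity of `W` and `H` is preserved automatically; the orthogonality of `W` is only approached, not enforced exactly, which is one motivation for manifold-based methods such as the one proposed in the paper.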

Cited by 5 publications (2 citation statements)
References 20 publications
“…The authors use NMF for dimensionality reduction to simplify the data and the relations in the data. To improve the sparseness of the base matrix in incremental NMF, the authors of [34] present a new method, orthogonal incremental NMF algorithm, which combines the orthogonality constraint with incremental learning. This approach adopts batch update in the process of incremental learning.…”
Section: Related Work
confidence: 99%
“…The incremental data challenge can be solved by incremental learning. Incremental learning means that a learning system can constantly learn new knowledge from new samples and can save most of the previously known knowledge [5]. However, the traditional federated learning method, which does not consider the weight of incremental data, largely depends on the repetition of the training process, and even leads to the severe decline of global ML model accuracy and ML model deviation.…”
Section: Introduction
confidence: 99%