2020
DOI: 10.1007/s11042-020-09766-w
Deep semi-nonnegative matrix factorization with elastic preserving for data representation

Cited by 16 publications (9 citation statements)
References 21 publications
“…Thus, each image is represented by a 1024-dimensional vector. Similarly, the COIL100 [24] image database has the same settings, which contains 7200 images of 100 objects.…”
Section: A. Datasets Description (mentioning; confidence: 99%)
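As a concrete illustration of the preprocessing described in this statement (the array below is synthetic; it is not data from COIL20 or COIL100), a 32×32 grayscale image flattens into exactly the 1024-dimensional feature vector mentioned:

```python
import numpy as np

# Synthetic stand-in for a 32x32 grayscale image (pixel values in [0, 255]).
image = np.arange(32 * 32, dtype=np.float64).reshape(32, 32)

# Flatten row by row into a single 1024-dimensional feature vector,
# mirroring the vectorization step described for the image databases.
vector = image.reshape(-1)

print(vector.shape)  # (1024,)
```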
“…Moreover, Qu et al [23] introduced a Graph Regularized Deep semi-Nonnegative Matrix Factorization (GR Deep semi-NMF) algorithm, which preserves the geometric structure information of the data while learning high-level feature representations. Meanwhile, Shu et al [24] proposed a Deep Semi-Nonnegative Matrix Factorization with Elastic Preserving (Deep Semi-NMF-EP) algorithm for data representation, which can effectively preserve the elasticity of the data and learn a better representation of high-dimensional data. After that, Yu et al [25] presented a Deep Non-Smooth Nonnegative Matrix Factorization (Deep nsNMF) algorithm, which not only gives parts-based features due to the nonnegativity constraints but also creates higher-level, more abstract features by combining lower-level ones.…”
Section: Introduction (mentioning; confidence: 99%)
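The deep factorizations discussed in this statement all share one core idea: factor the data matrix through several layers, X ≈ Z₁Z₂⋯Zₘ·Hₘ, with the final representation Hₘ kept nonnegative. The sketch below shows greedy layer-wise pretraining using Ding et al.'s semi-NMF multiplicative updates; it is a minimal illustration of plain deep semi-NMF, not the Deep Semi-NMF-EP algorithm of the cited paper (no elastic-preserving term, no fine-tuning stage):

```python
import numpy as np

def semi_nmf(X, k, n_iter=100, seed=0):
    """One semi-NMF layer: X ~= Z @ H with H >= 0, Z unconstrained.

    Uses Ding et al.'s semi-NMF multiplicative updates; a minimal
    sketch, not the cited Deep Semi-NMF-EP method itself.
    """
    rng = np.random.default_rng(seed)
    H = rng.random((k, X.shape[1])) + 1e-3

    def pos(A):
        return (np.abs(A) + A) / 2.0

    def neg(A):
        return (np.abs(A) - A) / 2.0

    for _ in range(n_iter):
        # Closed-form least-squares update for the unconstrained basis Z.
        Z = X @ H.T @ np.linalg.pinv(H @ H.T)
        ZtX, ZtZ = Z.T @ X, Z.T @ Z
        # Multiplicative update keeps H elementwise nonnegative.
        H *= np.sqrt((pos(ZtX) + neg(ZtZ) @ H)
                     / (neg(ZtX) + pos(ZtZ) @ H + 1e-9))
    return Z, H

def deep_semi_nmf(X, layer_sizes):
    """Greedy layer-wise factorization X ~= Z1 Z2 ... Zm @ Hm."""
    Zs, H = [], X
    for k in layer_sizes:
        Z, H = semi_nmf(H, k)
        Zs.append(Z)
    return Zs, H

X = np.random.default_rng(1).random((50, 40))   # toy data, 50 features x 40 samples
Zs, H = deep_semi_nmf(X, [20, 10])
print(H.shape)         # (10, 40)
print((H >= 0).all())  # True
```

The deeper layers progressively compress the representation (40-dimensional samples pass through a 20-unit and then a 10-unit layer here), which is the "higher-level, more abstract features" behavior the quoted statement attributes to these methods.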
“…In recent years, deep learning has exhibited outstanding performance in feature representation tasks [18][19][20]. Therefore, many researchers have introduced deep learning into matrix factorization and proposed a large number of deep feature representation methods [21][22][23][24][25][26][27]. Ahn et al [21] proposed multilayer nonnegative matrix factorization (MNMF).…”
Section: Introductionmentioning
confidence: 99%
“…In recent years, researchers have found that the relationships between data in real applications are usually high-dimensional and nonlinear, so the aforementioned linear representation approaches can hardly achieve good performance. Many researchers have paid more attention to revealing the nonlinear relationships between data points of interest [16][17][18][19][20][21][22][23][24][25][26][27]. For example, Wang et al [28] explored the criterion of Locally Linear Embedding (LLE) and used it to construct the graph by computing the weights between pairs of samples.…”
Section: Introduction (mentioning; confidence: 99%)
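The LLE-based graph construction mentioned in this last statement computes, for each sample, the weights that best reconstruct it from its neighbors under a sum-to-one constraint. The sketch below shows that standard LLE weight computation under assumed toy data; neighbor selection (e.g., k-NN) is taken as already done, and this is an illustration of the general criterion, not the specific construction in the cited paper:

```python
import numpy as np

def lle_weights(X, i, neighbors, reg=1e-3):
    """LLE reconstruction weights for sample i from a given neighbor set.

    Minimizes ||x_i - sum_j w_j x_j||^2 subject to sum_j w_j = 1,
    via the local Gram system; a minimal sketch of the LLE criterion.
    """
    # Center the neighbors at x_i, then form the local Gram matrix.
    D = X[neighbors] - X[i]
    G = D @ D.T
    # Regularize for stability when neighbors are nearly collinear.
    G += reg * np.trace(G) * np.eye(len(neighbors))
    # Solve G w = 1, then rescale to enforce the sum-to-one constraint.
    w = np.linalg.solve(G, np.ones(len(neighbors)))
    return w / w.sum()

rng = np.random.default_rng(0)
X = rng.random((10, 5))            # 10 toy samples in 5 dimensions
w = lle_weights(X, 0, [1, 2, 3])   # assumed neighbor indices for sample 0
print(len(w))  # 3
```

The resulting weights become the edge weights of the graph, so the graph regularizer preserves each sample's local linear reconstruction rather than only pairwise distances.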