2019
DOI: 10.1109/access.2019.2924140
Sparse General Non-Negative Matrix Factorization Based on Left Semi-Tensor Product

Abstract: Dimension reduction of large-scale, high-dimensional data is a challenging task. This is especially true for reducing the dimensionality of face data and increasing the accuracy of face recognition in large-scale face recognition systems, which otherwise demand large storage space and long recognition times. To further reduce the recognition time and storage space of large-scale face recognition systems, building on general non-negative matrix factorization based on the left semi-tensor product (GNMFL) without dimension …
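The paper's GNMFL algorithm itself is not reproduced in this abstract, but the underlying idea of sparsity-constrained non-negative matrix factorization can be illustrated with a minimal sketch. The code below uses conventional multiplicative updates with an L1 penalty on H; the function name `sparse_nmf`, the penalty weight `alpha`, and the toy data are illustrative assumptions, not the paper's left semi-tensor product formulation.

```python
import numpy as np

def sparse_nmf(V, rank, alpha=0.1, n_iter=200, eps=1e-9):
    """Factorize non-negative V (m x n) as W @ H with an L1 sparsity
    penalty (weight alpha) on H, using multiplicative updates.
    This is a generic sparse NMF sketch, not the paper's GNMFL."""
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # Update for H: alpha in the denominator shrinks entries
        # toward zero, which encourages a sparse encoding.
        H *= (W.T @ V) / (W.T @ W @ H + alpha + eps)
        # Standard multiplicative update for W.
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy stand-in for face data: 100 samples of 64 non-negative features.
V = np.abs(np.random.default_rng(1).normal(size=(64, 100)))
W, H = sparse_nmf(V, rank=10)
print(np.linalg.norm(V - W @ H) / np.linalg.norm(V))  # relative error
```

Reducing `rank` is what shrinks storage and speeds up recognition: each 64-dimensional sample is replaced by a 10-dimensional column of H.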

Cited by 11 publications (4 citation statements)
References 33 publications
“…From equations (13) to (16) we obtain the input X′ and output Y′ values used to train the neural network. This is the proposed long model for arranging the data in order to perform RS classification and then RS recommendation.…”
Section: B. Formalization and Data-Toy Example
confidence: 99%
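The excerpt above does not reproduce equations (13) to (16), so the arrangement of X′ and Y′ can only be sketched under assumptions. The snippet below assumes, hypothetically, that X′ concatenates a user factor vector with an item factor vector and that Y′ is the observed rating used as a class label; the factor matrices `P` and `Q` and all shapes are invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Hypothetical latent factors from a prior MF step: 50 users, 80 items, 8 factors.
P = rng.normal(size=(50, 8))   # user factors
Q = rng.normal(size=(80, 8))   # item factors
ratings = [(u, i, int(rng.integers(1, 6)))
           for u in range(50)
           for i in rng.choice(80, 5, replace=False)]

# Assumed arrangement: X' concatenates the user and item factor vectors;
# Y' is the observed rating treated as a class label for RS classification.
X_prime = np.array([np.concatenate([P[u], Q[i]]) for u, i, _ in ratings])
Y_prime = np.array([r for _, _, r in ratings])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500).fit(X_prime, Y_prime)
print(clf.predict(X_prime[:3]))  # predicted rating classes for 3 samples
```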
“…Matrix factorization (MF) is the most widely implemented approach, since it provides accurate recommendations, is easy to understand, and performs well. There are several MF variations, such as PMF [13], [14], BNMF [15], BPR [16], and eALS [17]. MF extracts the complex relations between items and users and encodes them into a reduced number of hidden factors.…”
Section: Introduction
confidence: 99%
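As a rough illustration of how MF encodes user-item relations into a reduced number of hidden factors, here is a minimal stochastic-gradient sketch of plain matrix factorization. It is none of the cited variants (PMF, BNMF, BPR, eALS); the function `mf_sgd`, its hyperparameters, and the toy ratings are assumptions.

```python
import numpy as np

def mf_sgd(ratings, n_users, n_items, k=8, lr=0.01, reg=0.05, epochs=30):
    """Learn user factors P and item factors Q from (user, item, rating)
    triples by stochastic gradient descent on the squared error."""
    rng = np.random.default_rng(0)
    P = 0.1 * rng.normal(size=(n_users, k))
    Q = 0.1 * rng.normal(size=(n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            e = r - P[u] @ Q[i]           # prediction error for this rating
            P[u] += lr * (e * Q[i] - reg * P[u])
            Q[i] += lr * (e * P[u] - reg * Q[i])
    return P, Q

ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0)]
P, Q = mf_sgd(ratings, n_users=3, n_items=2)
print(round(float(P[0] @ Q[0]), 2))  # reconstructed rating for (user 0, item 0)
```

Here `k` is the number of hidden factors: the sparser this bottleneck, the cheaper each prediction, which is the same trade-off the surveyed paper targets for face data.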
“…Predictions for each user can be computed by taking the dot product of the user factors and the item factors. Well-known MF variations are Positive Matrix Factorization (PMF) [16], [17], Bayesian Nonnegative Matrix Factorization (BNMF) [18], and Element-wise Alternating Least Squares (eALS) [19]. Finally, state-of-the-art CF RS implementations make use of neural networks [76], some of them combining MF with the multilayer perceptron [20].…”
Section: A. Recommendations to Individual Users
confidence: 99%
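The dot-product prediction described in the excerpt takes only a few lines. The factor matrices here are random stand-ins, and scoring every item for one user to produce a top-N list is an assumed usage, not the cited papers' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
P = rng.normal(size=(3, 8))   # user factors (3 users, 8 hidden factors)
Q = rng.normal(size=(10, 8))  # item factors (10 items)

u = 0
scores = Q @ P[u]                     # dot product of user u with every item
top_n = np.argsort(scores)[::-1][:3]  # highest predicted ratings first
print("top-3 items for user 0:", top_n)
```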
“…Let σ be the set of factor variances (14), let σ_f be the variance of factor f (15), let θ be the threshold of required accumulated variance (16), and let T be the set of factors that hold the required accumulated variance (17). To fix the proposed method's concepts we provide a data-toy example. Fig.…”
confidence: 99%
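Selecting the set T of factors that hold the required accumulated variance θ, as the excerpt defines in (14) to (17), can be sketched as follows; the helper name `factors_holding_variance` and the sample factor matrix are assumptions.

```python
import numpy as np

def factors_holding_variance(F, theta=0.9):
    """Given a factor matrix F (samples x factors), return indices T of the
    fewest factors whose accumulated share of total variance is >= theta."""
    sigma = F.var(axis=0)             # per-factor variances, sigma_f in (14)-(15)
    order = np.argsort(sigma)[::-1]   # largest variance first
    cum = np.cumsum(sigma[order]) / sigma.sum()
    # First position where the accumulated share reaches the threshold theta.
    return order[: np.searchsorted(cum, theta) + 1]

# Toy factors with decreasing spread, so the first columns dominate.
F = np.random.default_rng(0).normal(scale=[3, 1, 0.5, 0.1], size=(200, 4))
print(factors_holding_variance(F, theta=0.9))  # indices of the dominant factors
```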