2019
DOI: 10.1016/j.patrec.2019.07.005

Regularized asymmetric nonnegative matrix factorization for clustering in directed networks

Cited by 19 publications (4 citation statements)
References 24 publications
“…In 2019, the FANMF method was introduced by Tosyali [33]. The main purpose of the FANMF method is to handle nonnegative data that have an asymmetric nature.…”
Section: Methods
Citation type: mentioning
Confidence: 99%
“…By incorporating the graph information, the recommendation accuracy is improved by leveraging the connectivity and relationships among users and items. In 2019, factorized asymmetric nonnegative matrix factorization (FANMF) was introduced by Tosyali et al. by considering the asymmetric relationships between users and items [33]. It led to enhanced recommendation quality, a deeper understanding of user preferences, and ultimately provided personalized user experiences in various data analysis tasks.…”
Section: Literature Review
Citation type: mentioning
Confidence: 99%
“…For example, Wang et al. (2011) propose asymmetric non-negative matrix factorization (Asymmetric NMF, ANMF), which decomposes the asymmetric adjacency matrix of a directed graph into non-negative matrices and then divides the nodes into clusters based on the membership matrix. On this basis, Tosyali et al. (2019) propose Regularized Asymmetric NMF (RANMF) and improve the accuracy and robustness of directed-graph clustering. One of the most successful applications of NMF is in the field of computer vision.…”
Section: Non-negative Matrix Factorization
Citation type: mentioning
Confidence: 99%
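The excerpt above outlines the general asymmetric-NMF recipe for directed graphs: factor the asymmetric adjacency matrix into nonnegative factors and assign each node to a cluster through a membership matrix. The sketch below illustrates only that generic idea; it uses plain multiplicative updates for a Frobenius-norm objective and does not reproduce the actual ANMF or RANMF objectives, regularizers, or update rules from the cited papers. The function name, the update scheme, and the toy graph are illustrative assumptions.

```python
# Illustrative sketch only: factor the asymmetric adjacency matrix A ~= W @ H
# with nonnegative W, H, then read cluster labels from the membership factor W.
# This is NOT the ANMF/RANMF algorithm of the cited papers; it applies the
# generic multiplicative updates for min ||A - W H||_F^2 with W, H >= 0.
import numpy as np

def asymmetric_nmf_clusters(A, k, n_iter=300, eps=1e-9, seed=0):
    """Factor an n x n nonnegative adjacency matrix A into W (n x k) and
    H (k x n); node i is assigned to cluster argmax_j W[i, j]."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    W = rng.random((n, k))
    H = rng.random((k, n))
    for _ in range(n_iter):
        # Multiplicative updates keep both factors nonnegative by construction.
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W.argmax(axis=1)

# Toy directed graph: edges mostly stay within one of two groups of nodes.
A = np.array([
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
], dtype=float)
print(asymmetric_nmf_clusters(A, k=2))  # e.g. [0 0 0 1 1 1] (label order may swap)
```

As the quoted passage notes, RANMF adds regularization on top of this basic factorization to improve the accuracy and robustness of directed-graph clustering.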
“…These existing methods include the projected gradient method [34,52], the interior point method [37], the projected quasi-Newton method [22,51], the active-set method [3,46,24,31], the active-set-like method [27,28], and the alternating nonnegative least squares based on block principal pivoting (ANLS-BPP) method [28]. There also exist many variants of NMF (1.1) that add constraints and/or penalty terms on U and V for better interpretation and representation of the characteristics of the tasks [1,48], including sparse NMF [9,44,23], orthogonal NMF [33], semi-NMF [42], joint NMF [14], nonnegative tensor factorization [7,9,25,26,43], manifold NMF [49], kernel NMF [53], regularized NMF [44,47], symmetric NMF [19,45], integer-constrained NMF [12], and so on.…”
Citation type: mentioning
Confidence: 99%
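The excerpt above surveys solvers for the base NMF problem it labels (1.1) without restating that problem. The sketch below assumes the usual formulation min_{U,V >= 0} ||X - U V^T||_F^2 and applies a bare-bones projected-gradient step (one of the solver families the passage names), with a fixed step size rather than the line searches or block-coordinate strategies used by the cited methods; it is an illustration, not any of the cited algorithms.

```python
# Hedged illustration of the assumed base problem min_{U,V >= 0} ||X - U V^T||_F^2,
# solved with naive projected gradient descent (fixed step size).
import numpy as np

def nmf_projected_gradient(X, k, step=1e-3, n_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    for _ in range(n_iter):
        R = U @ V.T - X                          # residual of the current factorization
        grad_U = R @ V                           # gradient of 0.5*||X - U V^T||_F^2 w.r.t. U
        grad_V = R.T @ U                         # gradient w.r.t. V
        U = np.maximum(U - step * grad_U, 0.0)   # gradient step, then project onto U >= 0
        V = np.maximum(V - step * grad_V, 0.0)   # gradient step, then project onto V >= 0
    return U, V

X = np.abs(np.random.default_rng(1).random((20, 12)))  # synthetic nonnegative data
U, V = nmf_projected_gradient(X, k=3)
print(np.linalg.norm(X - U @ V.T))               # reconstruction error of the fit
```

The variants the passage lists (sparse, orthogonal, semi-, symmetric NMF, and so on) keep this same factorization at their core and differ in the constraints or penalty terms placed on U and V.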