2020
DOI: 10.1109/access.2020.3023557
Symmetric Nonnegative Matrix Factorization Based on Box-Constrained Half-Quadratic Optimization

Abstract: Nonnegative Matrix Factorization (NMF) based on half-quadratic (HQ) functions has been proven effective and robust when dealing with data contaminated by continuous occlusion, according to half-quadratic optimization theory. Nonetheless, state-of-the-art HQ NMF still cannot handle symmetric data matrices, which causes problems when applications require processing symmetric matrices, e.g., similarity matrices. In view of this, this study presents HQ Symmetric NMF along with HQ optimization algorithms that can …
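For context, symmetric NMF approximates a nonnegative symmetric matrix A (e.g., a similarity matrix) as A ≈ HHᵀ with H ≥ 0. The sketch below uses a standard damped multiplicative update as a baseline illustration; it is not the paper's half-quadratic algorithm, and the function name and damping parameter are illustrative assumptions.

```python
import numpy as np

def symnmf(A, k, n_iter=200, eps=1e-9, seed=0):
    """Baseline symmetric NMF: approximate A ≈ H @ H.T with H >= 0.

    Uses a beta-damped multiplicative update (a common heuristic for
    symmetric NMF). This is a generic sketch, NOT the HQ method from
    the paper, which additionally handles occluded/outlier entries.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    H = rng.random((n, k))          # nonnegative random initialization
    beta = 0.5                      # damping factor (assumed value)
    for _ in range(n_iter):
        AH = A @ H
        HHtH = H @ (H.T @ H)
        # Damped multiplicative update keeps H nonnegative by construction.
        H *= (1.0 - beta) + beta * AH / (HHtH + eps)
    return H
```

On a block-structured similarity matrix, the rows of H tend to indicate community membership, which is why symmetric NMF is a popular clustering tool for similarity data.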

Cited by 6 publications (2 citation statements) · References 34 publications
“…The number of nodes contained in the largest community in the network WebKB, Cora, Citeseer and Pubmed [40]. The specific parameters of the entire data set are shown in Table 3.…”
Section: Dataset
confidence: 99%
“…Traditional supervised learning deals with the analysis of single-label data, which means that each sample is associated with a single label. However, in many real-world data mining applications, such as text classification [1,2], scene classification [3,4], crowd sensing/mining [5][6][7][8][9][10][11], and gene functional classification [12,13], the samples are associated with more than one label. From this description, we understand that the challenge of the multilabel classification task is its potential output.…”
Section: Introduction
confidence: 99%