2017
DOI: 10.1155/2017/4216797
Robust Nonnegative Matrix Factorization via Joint Graph Laplacian and Discriminative Information for Identifying Differentially Expressed Genes

Abstract: Differential expression plays an important role in cancer diagnosis and classification. In recent years, many methods have been used to identify differentially expressed genes. However, the recognition rate and reliability of gene selection still need to be improved. In this paper, a novel constrained method named robust nonnegative matrix factorization via joint graph Laplacian and discriminative information (GLD-RNMF) is proposed for identifying differentially expressed genes, in which manifold learning and …
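The abstract describes an NMF objective augmented with a graph Laplacian penalty. The paper's exact GLD-RNMF updates are not shown here, but the general graph-regularized NMF family it builds on can be sketched with standard multiplicative updates (all names, the regularization weight `lam`, and the update rules below are illustrative, in the style of GNMF, not the authors' algorithm):

```python
import numpy as np

def graph_regularized_nmf(X, W_adj, rank, lam=0.1, iters=200, seed=0):
    """Minimize ||X - U V||_F^2 + lam * tr(V L V^T) with U, V >= 0.

    X     : (m, n) nonnegative data matrix (e.g. genes x samples)
    W_adj : (n, n) symmetric nonnegative sample-affinity matrix
    L = D - W_adj is the graph Laplacian over samples.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, rank))
    V = rng.random((rank, n))
    D = np.diag(W_adj.sum(axis=1))
    eps = 1e-10  # guards against division by zero
    for _ in range(iters):
        # multiplicative updates: keep factors nonnegative by construction
        U *= (X @ V.T) / (U @ V @ V.T + eps)
        V *= (U.T @ X + lam * V @ W_adj) / (U.T @ U @ V + lam * V @ D + eps)
    return U, V
```

The Laplacian term pulls the low-dimensional representations of affine (graph-adjacent) samples together, which is the "joint graph Laplacian" ingredient named in the title.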

Cited by 17 publications (6 citation statements)
References 44 publications
“…In 2008, a study showed that uncorrelated features reduce imputation efficiency, because earlier imputation methods such as traditional KNN tend to be biased toward outliers or uncorrelated features. As a consequence, the performance of KNN degrades, especially as the missing rate increases [9]. That paper proposes performing feature selection before imputation, a modified KNN called KNN-based feature selection (KNN-FS) [10].…”
Section: Introduction (mentioning)
confidence: 99%
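The idea cited here, restricting the distance computation to features correlated with the target before running KNN imputation, can be sketched as follows. This is a simplified illustration, not the KNN-FS algorithm from [10]; the function name, `top` threshold, and mean-fill preprocessing are all assumptions:

```python
import numpy as np

def knn_impute_with_selection(X, K=3, top=5):
    """Impute NaNs column by column.

    For each target column, keep only the `top` columns most correlated
    with it, measure sample distances on those columns only, and fill
    each missing value with the mean of its K nearest complete samples.
    """
    X = X.astype(float).copy()
    n, p = X.shape
    col_mean = np.nanmean(X, axis=0)
    filled = np.where(np.isnan(X), col_mean, X)  # rough fill, for distances only
    corr = np.abs(np.corrcoef(filled, rowvar=False))
    for j in range(p):
        miss = np.where(np.isnan(X[:, j]))[0]
        if miss.size == 0:
            continue
        feats = [f for f in np.argsort(-corr[j]) if f != j][:top]
        donors = np.where(~np.isnan(X[:, j]))[0]
        for i in miss:
            d = np.linalg.norm(filled[donors][:, feats] - filled[i, feats], axis=1)
            nearest = donors[np.argsort(d)[:K]]
            X[i, j] = X[nearest, j].mean()
    return X
```

Dropping uncorrelated columns from the distance is exactly what keeps outlier features from dominating the neighbor search, the failure mode the excerpt attributes to plain KNN.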
“…In graph theory, the “manifold assumption” is that data points near local geometrical structures should keep their proximity under a new basis [35]. If we map the adjacent data points x i and x j in the high-dimensional space to the low-dimensional space, their mapping data points z i and z j should be close in the low-dimensional space.…”
Section: Methods (mentioning)
confidence: 99%
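The manifold assumption quoted above is usually enforced through the Laplacian penalty tr(Z L Zᵀ), which equals the affinity-weighted sum of pairwise distances ½ Σᵢⱼ wᵢⱼ ‖zᵢ − zⱼ‖², so minimizing it keeps adjacent points close after mapping. A small numerical check of this identity (variable names illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3
Z = rng.random((d, n))          # columns z_i are the low-dimensional mappings
W = rng.random((n, n))
W = (W + W.T) / 2               # symmetric affinities w_ij
np.fill_diagonal(W, 0)

L = np.diag(W.sum(axis=1)) - W  # graph Laplacian L = D - W

trace_form = np.trace(Z @ L @ Z.T)
pair_form = 0.5 * sum(
    W[i, j] * np.linalg.norm(Z[:, i] - Z[:, j]) ** 2
    for i in range(n) for j in range(n)
)
assert np.isclose(trace_form, pair_form)
```

Because every term wᵢⱼ ‖zᵢ − zⱼ‖² is nonnegative, a small trace value forces strongly connected pairs (large wᵢⱼ) to stay close in the new basis.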
“…Unlike Ref. [5], it is a feature selection method. We then used k-means to cluster the samples based on the new data representations V .…”
Section: Experimental Configurations (mentioning)
confidence: 99%
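Clustering samples on the learned representation V rather than the raw data can be sketched as below. The plain Lloyd's k-means here is a stand-in, not the evaluation pipeline of the paper; note that samples are columns of V, so the clustering runs on V.T:

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm; points is (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # recompute centers (keep the old center if a cluster empties)
        for c in range(k):
            if (labels == c).any():
                centers[c] = points[labels == c].mean(axis=0)
    return labels

# toy V with 4 samples (columns) forming two well-separated groups
V = np.array([[0.1, 0.2, 2.0, 2.1],
              [0.0, 0.1, 1.9, 2.0]])
labels = kmeans(V.T, k=2)
assert labels[0] == labels[1] and labels[2] == labels[3]
assert labels[0] != labels[2]
```

Working in the factorized space is the usual way to evaluate an NMF-based method: if the representation V captures the sample structure, k-means on V should recover the class labels better than k-means on the raw matrix.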