2012
DOI: 10.1016/j.eswa.2012.03.015
Comparing the dimensionality reduction methods in gene expression databases

Cited by 15 publications (13 citation statements). References 12 publications.
“…It is important to note that the attribute selection results presented, concerning the gene expression domain by attribute selection, are from the research of Borges and Nievola (2012). The three chosen databases were subjected to the seven classifiers.…”
Section: Results (mentioning, confidence: 99%)
“…These sets have been used by Borges and Nievola (2012) in their studies. The three data sets were extracted from the Kent Ridge Bio-medical Dataset Repository.…”
Section: Discussion (mentioning, confidence: 99%)
“…Unsupervised methods try to preserve the structural information of the data in its low-dimensional representation; for instance, of the two typical unsupervised methods, principal component analysis (PCA) attempts to capture the variance of the data [2], while locality preserving projection (LPP) aims at preserving the locality relation [3]. Supervised methods, such as linear discriminant analysis (LDA) [4] and maximum margin criterion (MMC) [5], make use of labeled data to obtain reduced low-dimensional features containing the most discriminative information. Basically, supervised methods outperform unsupervised ones when labeled data are sufficient.…”
Section: Introduction (mentioning, confidence: 99%)
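The variance-maximizing projection that the statement above attributes to PCA can be sketched briefly. The following is a minimal, hedged illustration (not the cited paper's exact pipeline): it centers the data and projects it onto the top-k right singular vectors, which span the directions of maximal variance. The function name, the toy matrix, and its dimensions are all assumptions made for the example.

```python
# Minimal PCA sketch: project centered samples onto the top-k
# right singular vectors (directions of maximal variance).
# Hypothetical helper for illustration only.
import numpy as np

def pca_reduce(X, k):
    """Reduce an (n_samples, n_features) matrix X to k dimensions via PCA."""
    Xc = X - X.mean(axis=0)                           # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False) # singular values sorted descending
    return Xc @ Vt[:k].T                              # scores on the top-k components

# Toy "gene expression" matrix: 6 samples, 4 features (made-up data).
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
Z = pca_reduce(X, 2)
print(Z.shape)  # (6, 2)
```

Because the singular values are returned in descending order, the first retained component always carries at least as much variance as the second; supervised methods such as LDA would instead use the class labels to pick discriminative directions.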