2014
DOI: 10.1186/1471-2105-15-70

Feature weight estimation for gene selection: a local hyperlinear learning approach

Abstract: Background: Modeling high-dimensional data involving thousands of variables is particularly important for gene expression profiling experiments; nevertheless, it remains a challenging task. One of the challenges is to implement an effective method for selecting a small set of relevant genes buried in high-dimensional irrelevant noise. RELIEF is a popular and widely used approach for feature selection owing to its low computational cost and high accuracy. However, RELIEF-based methods suffer from instability, es…

Cited by 37 publications (23 citation statements)
References 31 publications
“…SVM-RCE-EC demonstrated improved classification accuracy compared to other methods tested (SVM-RFE [15], SVM-RCE [4], mRMR [16], IMRelief [17], SlimPLS [18] and SMKL-FS [12]) reported by the study of Du et al [12] and this was particularly true for data sets where the 2 classes were difficult to separate and where other methods either failed or had low performance.…”
Section: Results
confidence: 65%
“…The SVM-RCE-EC approach was compared to 8 previously reported methods [12] including: SVM-RFE [15], SVM-RCE [4], mRMR [16], IMRelief [17], SlimPLS [18] and SMKL-FS [12] using 10-fold Cross-Validation (CV) on a variety of datasets. The performance is calculated as the mean effectiveness measurement [12].…”
Section: Comparing Results Using SVM-RCE-EC and Different Classification…
confidence: 99%
“…A feature selection method reported recently, called LHR, uses a highly diagnostic yet compact feature subset [49]. The five features discovered include ADC, Sum Average, Entropy, Elongation and Sum Variance.…”
Section: Results
confidence: 99%
“…The procedure works well on selected genes with higher correlation coefficients based on symmetrical uncertainty. Alternatively, Cai et al [2] initiated a feature weighting algorithm for gene selection called LHR. LHR estimates the feature weights through local approximation based on ReliefF.…”
Section: Introduction
confidence: 99%
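The statement above notes that LHR estimates feature weights through local approximation based on ReliefF. As a minimal sketch of the underlying idea only — classic binary RELIEF, not the authors' LHR algorithm — each sampled instance is contrasted with its nearest "hit" (same class) and nearest "miss" (opposite class): features that separate the miss gain weight, features that vary even within the class lose it. All names below are illustrative, not from the paper.

```python
import numpy as np

def relief_weights(X, y, m=None, seed=0):
    """Classic binary RELIEF weight estimation (Kira & Rendell style).

    X : (n, d) feature matrix, y : (n,) binary labels.
    m : number of instances to sample (default: all).
    Returns a length-d array of feature weights.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    rng = np.random.default_rng(seed)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)      # per-feature scale for diffs
    span[span == 0] = 1.0                     # avoid division by zero
    w = np.zeros(d)
    idx = rng.choice(n, size=m or n, replace=False)
    for i in idx:
        dists = np.abs(X - X[i]).sum(axis=1)  # L1 distance to every point
        dists[i] = np.inf                     # exclude the instance itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dists, np.inf))   # nearest same-class
        miss = np.argmin(np.where(~same, dists, np.inf)) # nearest other-class
        # reward separation from the miss, penalize spread within the class
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / span
    return w / len(idx)
```

On a toy dataset where feature 0 separates the two classes and feature 1 is noise, `relief_weights` assigns feature 0 a clearly positive weight and feature 1 a negative one. LHR's contribution, per the statement above, is to replace this single nearest hit/miss with a local hyperplane (hyperlinear) approximation, which stabilizes the weights on noisy high-dimensional data.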