2010
DOI: 10.1007/978-3-642-05179-1_17

Monte Carlo Feature Selection and Interdependency Discovery in Supervised Classification

Abstract: Applications of machine learning techniques in the Life Sciences are forcing a paradigm shift in how these techniques are used. Rather than obtaining the best possible supervised classifier, the Life Scientist needs to know which features contribute best to classifying observations into distinct classes and what the interdependencies between the features are. To this end we significantly extend our earlier work [Dramiński et al. (2008)], which introduced an effective and reliable method for…

Cited by 24 publications (28 citation statements). References 27 publications.
“…Since the random binary features with low or high indices were defined to have an excess of ones or zeroes, respectively, this corresponds to a weak preference for features with a uniform distribution of values. However, no relation between the value range of a feature and its relative importance was observed, consistent with previously reported results [11], although the variation of the RIs increased slightly with the value range (Fig. 2b).…”
Section: Results of Simulation Study (supporting, confidence: 93%)
“…The use of MCFS was originally illustrated by selecting genes with importance for leukemia and lymphoma [3], and it was later used to study e.g. HIV-1 by selecting residues in the amino acid sequence of reverse transcriptase with importance for drug resistance [4,5]. Furthermore, MCFS may be used to rank the features based on their relative importance score.…”
(mentioning, confidence: 99%)
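The MCFS ranking described above can be illustrated with a much-simplified sketch: repeatedly draw small random subsets of features, train decision trees on bootstrap-like splits, and accumulate each feature's within-tree importance weighted by the tree's test accuracy. This is a hedged approximation only; the published MCFS procedure uses its own weighted-accuracy (wAcc) and per-node information-gain scheme, and the scikit-learn classes, toy data, and parameter values (`m`, `s`, `t`) here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy data: with shuffle=False the 3 informative columns come first,
# followed by 2 redundant columns, then pure-noise columns.
X, y = make_classification(n_samples=300, n_features=20, n_informative=3,
                           shuffle=False, random_state=0)

rng = np.random.default_rng(0)
n_feats = X.shape[1]
m, s, t = 5, 100, 5          # m features per subset, s subsets, t trees per subset
ri = np.zeros(n_feats)       # accumulated relative importance per feature

for i in range(s):
    subset = rng.choice(n_feats, size=m, replace=False)
    for j in range(t):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X[:, subset], y, train_size=0.66, random_state=i * t + j)
        tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
        # Weight each feature's within-tree importance by the tree's test accuracy.
        ri[subset] += tree.score(X_te, y_te) * tree.feature_importances_

ranking = np.argsort(ri)[::-1]   # feature indices, most important first
print(ranking[:5])
```

Because every feature appears in many random subsets, informative features accumulate high scores even when individual trees see only a fraction of the feature space.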
“…Another noteworthy example of a multivariate feature ranker is Breiman's relevance measure. It expresses the average increase in classification error that results from randomizing the attributes used to construct the trees in the Random Forest algorithm [140,141].…”
Section: Examples of Similarity Learning Models (mentioning, confidence: 99%)
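The randomization idea behind Breiman's relevance measure can be sketched with a model-agnostic variant: shuffle one feature's values on held-out data and measure the average drop in accuracy. Note this is an assumption-laden stand-in, Breiman's original measure permutes out-of-bag samples inside the forest, whereas `sklearn.inspection.permutation_importance` used below works on any fitted model and an external test split; the toy dataset and parameters are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Toy data: first 3 columns informative, next 2 redundant (shuffle=False).
X, y = make_classification(n_samples=500, n_features=8, n_informative=3,
                           shuffle=False, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Mean drop in held-out accuracy when one feature's values are shuffled,
# which severs that feature's relation to the labels.
result = permutation_importance(rf, X_te, y_te, n_repeats=20, random_state=0)
for idx in np.argsort(result.importances_mean)[::-1]:
    print(f"feature {idx}: {result.importances_mean[idx]:+.3f}")
```

Shuffling an uninformative feature leaves accuracy essentially unchanged (importance near zero), while shuffling an informative one produces a clear error increase.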
“…Many different feature selection methods have been proposed; see, for example, [1], [2], [3], and [4].…”
Section: Preliminary Remarks (mentioning, confidence: 99%)