2011
DOI: 10.1007/978-3-642-24958-7_28

Analysis of Feature Weighting Methods Based on Feature Ranking Methods for Classification

Abstract: We propose and analyze new, fast feature weighting algorithms based on different types of feature ranking. Feature weighting may be much faster than feature selection because there is no need to find a cut-off threshold in the ranking. The presented weighting schemes may be combined with several distance-based classifiers such as SVM, kNN, or RBF networks (and not only these). The results show that such methods can be successfully used with these classifiers.
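
As a rough illustration of the idea in the abstract, the sketch below turns ranking scores directly into weights, with no cut-off threshold, and plugs them into a distance-based classifier. The ranking criterion (mutual information), the normalisation, and the dataset are assumptions made for illustration; this is not necessarily the authors' exact scheme.

```python
# Minimal sketch of ranking-based feature weighting with a kNN classifier.
# The criterion (mutual information) and the normalisation are assumptions,
# not the scheme from the paper itself.
from sklearn.datasets import load_wine
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1. Rank features: one relevance score per feature.
scores = mutual_info_classif(X_train, y_train, random_state=0)

# 2. Turn the scores directly into weights -- no cut-off threshold is needed.
weights = scores / scores.sum()

# 3. Rescale the feature space so Euclidean distance reflects the weights.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train * weights, y_train)
print("weighted kNN accuracy:", knn.score(X_test * weights, y_test))
```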

Cited by 14 publications (9 citation statements) · References 14 publications
“…The importance of attribute weighting in improving classification performance has been analysed by numerous works (Jankowski and Usowicz 2011;Kohavi et al 1997). Several authors (Leopold and Kindermann 2002;Lan et al 2005) have also stated that the selection of an adequate weighting function is even more important than the parameterisation of the kernel in SVM algorithms.…”
Section: Tag Weighting (mentioning)
confidence: 99%
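
To make the remark about weighting functions and SVM kernels concrete, the sketch below shows one common way per-feature weights enter an RBF kernel: scaling each feature by the square root of its weight is equivalent to using the weighted squared distance inside the exponential. The data and the weight values are placeholders, not taken from any of the cited works.

```python
# Sketch: feature weights inside an RBF kernel.
# exp(-gamma * sum_j w_j (x_j - z_j)^2) == RBF kernel on features scaled by sqrt(w_j).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.1 * X[:, 1] > 0).astype(int)   # only the first two features matter

w = np.array([0.7, 0.2, 0.05, 0.05])            # hypothetical ranking-based weights
Xw = X * np.sqrt(w)

clf = SVC(kernel="rbf", gamma=1.0).fit(Xw, y)
print("training accuracy with weighted RBF kernel:", clf.score(Xw, y))
```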
“…Several feature weighting methods, such as term frequency methods [12], feature ranking methods [13], and self-adjustment methods [14,15], have been proposed and used in practice. Choosing appropriate feature weights can improve classification.…”
Section: Feature Weights (mentioning)
confidence: 99%
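
The sketch below contrasts two of the weighting families listed in this citation on toy text data: term-frequency weighting (TF-IDF) and ranking-based weighting (chi-squared scores). The documents, labels, and concrete criteria are illustrative assumptions, not taken from the cited works.

```python
# Two weighting families on toy text data:
# term-frequency weights come from the documents; ranking weights come from
# how well each term separates the classes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import chi2

docs = ["cheap loans now", "meeting at noon", "cheap pills now", "lunch meeting today"]
labels = [1, 0, 1, 0]                      # 1 = spam, 0 = ham (toy labels)

tfidf = TfidfVectorizer().fit(docs)        # term-frequency family
X = tfidf.transform(docs)

scores, _ = chi2(X, labels)                # ranking family
rank_weights = scores / scores.sum()
top = sorted(zip(tfidf.get_feature_names_out(), rank_weights), key=lambda t: -t[1])[:3]
for term, w in top:
    print(term, round(w, 3))
```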
“…Forty-four out of forty-five experiments demonstrated that the proposed weighting process achieved good performance on a dataset that included many irrelevant features. In [13], a new fast feature weighting algorithm was proposed based on feature ranking methods. The ranking methods involved correlations, information theory, and distances between probability distributions.…”
Section: Feature Weights (mentioning)
confidence: 99%
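
Since the citation names three families of ranking criteria (correlations, information theory, distances between probability distributions), the sketch below computes one illustrative score from each family. The concrete measures (Pearson correlation, mutual information, Kolmogorov-Smirnov distance) and the dataset are assumptions, not necessarily those used in [13].

```python
# One illustrative ranking score per family; each could be normalised into weights.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)

# Correlation family: absolute Pearson correlation of each feature with the label.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])

# Information-theory family: mutual information between each feature and the label.
mi = mutual_info_classif(X, y, random_state=0)

# Distribution-distance family: Kolmogorov-Smirnov distance between class-conditional samples.
ks = np.array([ks_2samp(X[y == 0, j], X[y == 1, j]).statistic for j in range(X.shape[1])])

for name, s in [("correlation", corr), ("mutual info", mi), ("KS distance", ks)]:
    print(name, "-> top-ranked feature index:", int(np.argmax(s)))
```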
“…Feature weighting is broadly used in batch learning [25,3] to assign different weights to features according to their relevance to the concept to be learned and to improve prediction accuracy. As shown earlier, in contrast to static scenarios, the relevance of features may increase or decrease over the course of a data stream; thus, techniques for detecting these changes are needed.…”
Section: Dynamic Feature Weighting (mentioning)
confidence: 99%
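
A minimal sketch of the dynamic setting this citation describes: feature weights maintained on a stream from exponentially decayed class-conditional means, so that the weights can follow drifting relevance. The decay factor, the relevance measure, and the simulated drift are illustrative assumptions, not the method of the cited work.

```python
# Online feature weighting for a data stream with concept drift.
import numpy as np

class StreamFeatureWeighter:
    def __init__(self, n_features, decay=0.99):
        self.decay = decay
        self.mean = np.zeros((2, n_features))          # decayed class-conditional means

    def update(self, x, label):
        self.mean[label] = self.decay * self.mean[label] + (1 - self.decay) * x

    def weights(self):
        gap = np.abs(self.mean[1] - self.mean[0])      # larger gap -> more relevant feature
        total = gap.sum()
        return gap / total if total > 0 else np.full(gap.size, 1.0 / gap.size)

rng = np.random.default_rng(0)
fw = StreamFeatureWeighter(n_features=5)
for t in range(2000):
    informative = 0 if t < 1000 else 3                 # simulated drift: relevance moves to feature 3
    x = rng.normal(size=5)
    label = int(x[informative] > 0)
    x[informative] += 2 * label                        # the informative feature separates the classes
    fw.update(x, label)

print("weights after drift:", np.round(fw.weights(), 2))
```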