Proceedings of the 27th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval 2004
DOI: 10.1145/1008992.1009034
Feature selection using linear classifier weights

Cited by 158 publications (94 citation statements)
References 6 publications
“…A weight-based ranking was applied (Mladenić et al., 2004). Vector elements associated with small-magnitude weights |Ŵ| are relatively unimportant for classification, where |Ŵ| is the absolute value of Ŵ normalized to the observed maximum.…”
Section: Feature Reduction
confidence: 99%
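The weight-based ranking quoted above maps onto a short procedure: train a linear classifier, take the absolute value of each feature's weight, normalize by the largest observed magnitude, and drop features whose normalized score falls below a cutoff. The sketch below is a minimal illustration of that idea, assuming scikit-learn's LinearSVC as the linear classifier and an arbitrary threshold; neither is taken from the cited work.

```python
import numpy as np
from sklearn.svm import LinearSVC

def rank_features_by_weight(X, y, threshold=0.2):
    """Rank features by normalized absolute linear-classifier weight.

    Features whose |w| / max|w| falls below `threshold` are treated as
    relatively unimportant and discarded. The threshold is illustrative.
    """
    clf = LinearSVC(C=1.0, max_iter=10000)
    clf.fit(X, y)

    # coef_ has shape (1, n_features) for binary problems; take the
    # largest absolute weight per feature to cover multiclass as well.
    abs_w = np.abs(clf.coef_).max(axis=0)

    # Normalize to the observed maximum, as in the quoted description.
    scores = abs_w / abs_w.max()

    kept = np.where(scores >= threshold)[0]
    return kept, scores

# Purely illustrative usage on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    kept, scores = rank_features_by_weight(X, y)
    print(f"Kept {len(kept)} of {X.shape[1]} features")
```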
“…Do's method uses Daubechies 4-tap wavelets and a pyramidal transform; the same weight is assigned to each subband and the low-frequency subband is discarded. The proposed method is also compared to a retrieval method based on Zernike moments (Khotanzad and Hong, 1990), in which the influence of each moment in the distance measure is weighted using an SVM (Mladenić et al., 2004). The results are reported in Figure 12.…”
Section: Results
confidence: 99%
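The statement above mentions weighting each moment's influence in the distance measure with weights obtained from an SVM. One possible reading of that idea, assuming the weights are the absolute coefficients of a linear SVM, is sketched below; the weight source and the role of the moment vectors are placeholders, not the cited paper's actual retrieval pipeline.

```python
import numpy as np

def weighted_distance(q, x, weights):
    """Weighted Euclidean distance between two feature (e.g. moment) vectors.

    `weights` could come from a linear classifier, e.g. np.abs(svm.coef_[0]),
    so that more discriminative moments contribute more to the ranking.
    """
    diff = q - x
    return float(np.sqrt(np.sum(weights * diff * diff)))
```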
“…Note that F is the set of all features taken into account. There are numerous feature-weighting methods that assign weights to features, such as information gain [7], weights from a linear classifier [15], odds ratio, etc. Herein, we consider two well-known weighting schemes.…”
Section: B. Feature Selection and Feature Weighting
confidence: 99%
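To make the named alternatives concrete, the sketch below computes two feature-weighting schemes side by side: an information-gain-style score (estimated here with scikit-learn's mutual_info_classif, used as a common stand-in) and normalized absolute weights from a linear classifier. The excerpt does not say which two schemes the citing paper actually adopts; this is only an illustration of the kinds of weights involved.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression

def information_gain_weights(X, y):
    # Mutual information between each feature and the label,
    # used here as a proxy for information gain.
    return mutual_info_classif(X, y, random_state=0)

def linear_classifier_weights(X, y):
    # Absolute coefficients of a linear model, normalized to [0, 1].
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    w = np.abs(clf.coef_).max(axis=0)
    return w / w.max()
```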