2009 3rd International Conference on Signals, Circuits and Systems (SCS)
DOI: 10.1109/icscs.2009.5412341
Feature Selection using an SVM learning machine

Abstract: In this paper we propose an approach to feature selection for Support Vector Machines (SVMs). Feature selection is effective in searching for the most descriptive features, which helps increase the effectiveness of the classification algorithm. The process described here consists of a backward elimination strategy based on the misclassification-rate criterion. We use a tabu algorithm to guide the search for the optimal set of features; each set of features is assessed according to its goodnes…
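The abstract's procedure — backward elimination guided by a tabu list, scoring each candidate feature subset by an estimated misclassification rate — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function and parameter names are hypothetical, and the SVM error estimate is abstracted into an arbitrary `error_fn` callable (in the paper this would be the SVM's misclassification rate on the subset).

```python
from collections import deque

def tabu_backward_selection(n_features, error_fn, tabu_size=3, max_iters=10):
    """Backward elimination guided by a tabu list.

    error_fn(subset) returns the estimated misclassification rate of a
    classifier trained on that feature subset; in the paper's setting this
    would come from an SVM, but any callable works here.
    """
    current = frozenset(range(n_features))
    best, best_err = current, error_fn(current)
    tabu = deque(maxlen=tabu_size)  # recently removed features are frozen
    for _ in range(max_iters):
        candidates = [f for f in current if f not in tabu]
        if len(candidates) <= 1:
            break
        # evaluate every single-feature removal and take the least harmful
        err, f = min((error_fn(current - {f}), f) for f in candidates)
        current = current - {f}
        tabu.append(f)
        if err <= best_err:  # remember the best subset seen so far
            best, best_err = current, err
    return best, best_err

# Toy stand-in for an SVM error estimate: features 0 and 2 are "informative",
# and every retained feature adds a small noise penalty.
def toy_error(subset):
    missing = len({0, 2} - subset)
    return 0.3 * missing + 0.01 * len(subset)

subset, err = tabu_backward_selection(5, toy_error)
# subset is frozenset({0, 2}): the noise features 1, 3, 4 are eliminated
```

The tabu list prevents the search from immediately re-examining a feature it just removed, which is what lets the method escape the purely greedy path of plain backward elimination.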

Cited by 14 publications (4 citation statements); references 4 publications.
“…SVMs have been extensively used as a classification tool with associated learning algorithms to analyze data and recognize patterns [11]. Besides high classification accuracy, SVMs also have a number of advantages: firstly, SVM has a superior generalization capability for small training sets [12].…”
Section: A. Data Analysis Methods
confidence: 99%
“…The main objective of feature selection can be described by [18]. However, the classification error can be increased by the elimination of certain highly relevant information, since that information, if used, can prove to be informative [19]. Our goal is to design efficient algorithms to select a solid set of pertinent features.…”
Section: Feature Selection and Validation
confidence: 99%
“…Two typical approaches exist to achieve this objective: feature selection and feature extraction. Feature selection methods search for the optimal subset of original features [8] [9].…”
Section: Introduction
confidence: 99%