2012
DOI: 10.1007/978-3-642-34481-7_21

Improving Support Vector Machine Using a Stochastic Local Search for Classification in Data Mining

Cited by 5 publications (5 citation statements)
References 5 publications
“…Between HSA-SLS2, MA + SVM, SLS + SVM and GA + SVM: Table 6 gives the average (Mean), the best (Max), and the worst (Min) values of the classification accuracy, together with the standard deviation (Sd), obtained by HSA-SLS2, the optimized MA + SVM [35], SLS + SVM [33,35] and GA + SVM [12]. As shown in Table 6, HSA-SLS2 finds the best results on almost all the tested datasets compared with the optimized MA + SVM, SLS + SVM and GA + SVM methods.…”
Section: Comparison (mentioning)
confidence: 99%
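The Mean/Max/Min/Sd columns described in the statement above are ordinary summary statistics over repeated runs, and can be reproduced with Python's standard library. The accuracy values below are made-up placeholders for illustration, not figures from the paper's Table 6.

```python
import statistics

# Hypothetical per-run classification accuracies for one method on one
# dataset; Table 6 in the cited comparison reports these four summary
# statistics per method and dataset.
accuracies = [0.94, 0.96, 0.95, 0.93, 0.97]

summary = {
    "Mean": statistics.mean(accuracies),
    "Max": max(accuracies),
    "Min": min(accuracies),
    "Sd": statistics.stdev(accuracies),  # sample standard deviation
}
```

Sd here is the sample (n-1) standard deviation; whether the original tables use the sample or population form is not stated in the excerpt.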
“…The SLS is a local search meta-heuristic that has already been studied for several optimization problems, such as the satisfiability problem and the optimal winner determination problem in combinatorial auctions [6,33]. The SLS starts from each individual x_i ∈ Ω selected by a probabilistic selection strategy from the HM at each iteration NI of the proposed HSA-SLS method.…”
Section: Refinement With Stochastic Local Search (mentioning)
confidence: 99%
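The refinement step quoted above can be sketched roughly as follows: a generic stochastic local search over a binary vector (e.g. a feature-selection mask), not the authors' exact HSA-SLS procedure. The bit-flip neighbourhood, the walk probability, and the toy objective are illustrative assumptions, not details taken from the paper.

```python
import random

def sls(individual, fitness, max_iters=100, walk_prob=0.2, seed=0):
    """Refine a binary vector with a stochastic local search: flip one
    bit per step, keep the flip if it does not worsen fitness, and
    otherwise still accept it with probability `walk_prob` so the
    search can escape local optima. The best vector seen is tracked."""
    rng = random.Random(seed)
    best = list(individual)
    best_fit = fitness(best)
    current, current_fit = list(best), best_fit
    for _ in range(max_iters):
        i = rng.randrange(len(current))
        neighbour = list(current)
        neighbour[i] = 1 - neighbour[i]          # flip one bit
        neigh_fit = fitness(neighbour)
        if neigh_fit >= current_fit or rng.random() < walk_prob:
            current, current_fit = neighbour, neigh_fit
        if current_fit > best_fit:               # keep the incumbent
            best, best_fit = list(current), current_fit
    return best, best_fit

# Toy stand-in for a classifier-accuracy objective: reward masks that
# agree with a fixed target selection (the real objective would be an
# SVM's cross-validated accuracy on the selected features).
target = [1, 1, 1, 0, 0, 0, 0, 0]
score = lambda mask: sum(1 for a, b in zip(mask, target) if a == b)

mask, fit = sls([0] * 8, score, max_iters=200)
```

In the hybrid described in the excerpt, the starting `individual` would be drawn probabilistically from the harmony memory (HM) and the refined vector written back, with the SLS acting as the local-improvement phase of each harmony-search iteration.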
“…Several methods have been developed for feature selection. Among them are the stepwise forward selection and stepwise backward elimination techniques [8], the best-first search [9], the stochastic local search (SLS) [10], the harmony search [11], genetic algorithms [1,12-14], the Memetic algorithm [15], particle swarm optimization [16], and hyper-heuristics [17,18].…”
Section: Feature Selection (mentioning)
confidence: 99%
“…In information fusion algorithms, the features extracted from multi-sensor data may be redundant and contain false correlations, leading to high computational complexity and low classification accuracy [11]. Thus, reducing feature redundancy is necessary and important in fusion algorithms.…”
Section: Introduction (mentioning)