2017
DOI: 10.1016/j.neucom.2016.11.026
Wavelet twin support vector machines based on glowworm swarm optimization

Cited by 60 publications (18 citation statements)
References 41 publications
“…The SI applications in those two aspects are mainly related to parameter tuning. For classification, works can be found in the literature that combine SI algorithms with regression models [53], support vector machines [7, 14, 60], k-nearest neighbour classifiers [58, 65], decision trees [3, 35], as well as neural networks [30, 62]. For clustering, some recent works utilize SI with k-means [28, 59, 61], c-means [21], and other linear or non-linear clustering algorithms [19, 27].…”
Section: Theoretical Applications (citation type: mentioning)
confidence: 99%
“…PSO is a search model based on random trajectories, and it tends to converge towards local optima, or even arbitrary points, rather than the global optimum. Glowworm swarm optimisation (GSO) [13] has the advantages of strong versatility and generality. Building on PSO, QPSO was proposed by Sun et al. in 2004 [14]. PSO easily falls into locally optimal solutions, so its search result is not necessarily the global optimum [15].…”
Section: Basic Theory (citation type: mentioning)
confidence: 99%
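The excerpt above contrasts PSO's tendency toward local optima with glowworm swarm optimisation (GSO). As a concrete illustration of the GSO mechanics it alludes to (luciferin update, probabilistic movement toward a brighter neighbour, adaptive decision radius), here is a minimal sketch; all parameter names and default values (`rho`, `gamma`, `step`, `n_t`, etc.) are illustrative assumptions, not the settings used in the cited paper:

```python
import numpy as np

def gso_maximize(J, bounds, n_agents=40, n_iters=150,
                 rho=0.4, gamma=0.6, step=0.03,
                 r_s=2.0, beta=0.08, n_t=5, seed=0):
    """Minimal glowworm swarm optimization sketch (Krishnanand-Ghose style).

    J      : objective to maximize, maps an ndarray point to a float
    bounds : (lower, upper) arrays defining a box search space
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_agents, lo.size))  # glowworm positions
    luc = np.full(n_agents, 5.0)   # initial luciferin levels
    r_d = np.full(n_agents, r_s)   # per-agent dynamic decision radii
    best_x, best_val = None, -np.inf
    for _ in range(n_iters):
        vals = np.array([J(xi) for xi in x])
        # luciferin update: decay plus fitness-proportional reinforcement
        luc = (1 - rho) * luc + gamma * vals
        i_best = int(np.argmax(vals))
        if vals[i_best] > best_val:
            best_val, best_x = float(vals[i_best]), x[i_best].copy()
        new_x = x.copy()
        for i in range(n_agents):
            d = np.linalg.norm(x - x[i], axis=1)
            # neighbours: inside the decision radius and strictly brighter
            nbrs = np.where((d < r_d[i]) & (luc > luc[i]))[0]
            if nbrs.size:
                w = luc[nbrs] - luc[i]
                j = rng.choice(nbrs, p=w / w.sum())  # pick a brighter neighbour
                direction = (x[j] - x[i]) / (np.linalg.norm(x[j] - x[i]) + 1e-12)
                new_x[i] = np.clip(x[i] + step * direction, lo, hi)
            # adapt the radius toward a target neighbour count n_t
            r_d[i] = min(r_s, max(0.0, r_d[i] + beta * (n_t - nbrs.size)))
        x = new_x
    return best_x, best_val
```

Because each agent only reacts to brighter neighbours within its local radius, the swarm can split into subgroups and settle on several optima simultaneously, which is the versatility the excerpt credits GSO with.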
“…where c1 ≥ 0 and c2 ≥ 0 are penalty parameters, ξ1 and ξ2 are slack variables, e1 and e2 are vectors of ones, and s1 and s2 are the fuzzy memberships of the two classes of sample points. By introducing Lagrangian multipliers, the corresponding Wolfe duals of QPPs (18) and (19) can be represented as…”
Section: B. TBSVM (citation type: mentioning)
confidence: 99%
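The excerpt above references QPPs (18) and (19) without showing them. A plausible reconstruction of the first fuzzy-weighted TWSVM-style primal and its Wolfe dual, using the symbols the excerpt defines (c1, ξ2, e1, e2, s2), is given below; the exact numbered forms in the citing paper may differ:

```latex
\begin{aligned}
\min_{w_1,\, b_1,\, \xi_2}\quad
  & \tfrac{1}{2}\lVert A w_1 + e_1 b_1 \rVert^2 + c_1\, s_2^{\top} \xi_2 \\
\text{s.t.}\quad
  & -(B w_1 + e_2 b_1) + \xi_2 \ge e_2, \qquad \xi_2 \ge 0,
\end{aligned}
\qquad
\begin{aligned}
\max_{\alpha}\quad
  & e_2^{\top}\alpha - \tfrac{1}{2}\,\alpha^{\top} G \left(H^{\top} H\right)^{-1} G^{\top} \alpha \\
\text{s.t.}\quad
  & 0 \le \alpha \le c_1 s_2,
\end{aligned}
```

where $H = [A \;\; e_1]$ and $G = [B \;\; e_2]$ stack the augmented samples of the two classes; the membership vector $s_2$ simply rescales the per-sample upper bounds on the dual multipliers, which is how the fuzzy weighting enters. The second QPP is obtained by exchanging the roles of the two classes.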
“…This makes the training speed of TWSVM four times faster than that of SVM. Since then, various TWSVM methods have been widely investigated [11]-[23], such as least squares twin SVM (LSTSVM) [11], twin bounded SVM (TBSVM) [12], twin parametric-margin SVM (TPMSVM) [13], coordinate descent margin-based twin SVM (CDMTSVM) [14], robust twin SVM (RTSVM) [15], nonparallel hyperplane SVM (NHSVM) [16], maximum margin of twin spheres support vector machine (MMTSSVM) [17], novel twin SVM (NTSVM) [18], wavelet twin SVM (WTWSVM) [19], angle-based twin SVM (ATSVM) [20], sparse pinball TSVM (SPTWSVM) [21], Pin-GTSVM [22], and improved universum TSVM (IUTSVM) [23]. To deal with the complex XOR problem, and inspired by the multi-weight-vector projection SVM (MVSVM) [24], Chen et al. [25] proposed the projection twin SVM (PTSVM), which seeks a projection direction rather than a hyperplane for each class.…”
Section: Introduction (citation type: mentioning)
confidence: 99%
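The excerpt above explains that the TWSVM family replaces one large QPP with two smaller problems, one nonparallel hyperplane per class. To make that idea concrete, here is a sketch of the least-squares variant (LSTSVM [11] in the excerpt's numbering), which admits a closed-form solution via two small linear systems; the function names and toy data are illustrative assumptions, not code from any of the cited papers:

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0):
    """Linear least-squares twin SVM: one hyperplane per class.

    A : samples of class +1 (rows), B : samples of class -1.
    Returns the augmented normals z1 = (w1, b1) and z2 = (w2, b2).
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])  # class +1 samples, augmented with bias column
    F = np.hstack([B, e2])  # class -1 samples, augmented with bias column
    # Plane 1: close to class +1, pushed away from class -1
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * (E.T @ E), F.T @ e2)
    # Plane 2: close to class -1, pushed away from class +1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * (F.T @ F), E.T @ e1)
    return z1.ravel(), z2.ravel()

def lstsvm_predict(X, z1, z2):
    """Assign each point to the class whose hyperplane is nearer."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    d1 = np.abs(Xa @ z1) / np.linalg.norm(z1[:-1])
    d2 = np.abs(Xa @ z2) / np.linalg.norm(z2[:-1])
    return np.where(d1 <= d2, 1, -1)
```

Each system is built only from the two class matrices, so the dominant cost is solving two (d+1)-by-(d+1) systems rather than one QP over all samples, which illustrates where the speed advantage claimed in the excerpt comes from.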