2015
DOI: 10.1016/j.knosys.2015.05.015
Dynamic selection of the best base classifier in One versus One

Cited by 24 publications (12 citation statements)
References 31 publications
“…Once the data have been processed, the previously explained CSP algorithm is applied. The CSP method used is implemented to work with only two classes; therefore, all tests have been carried out using pairs of classes, although multiclass classification is possible using pairwise classification approaches, such as One versus One (OVO) as a class binarization technique [50].…”
Section: Results
confidence: 99%
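The pairwise (OVO) binarization mentioned in the quote can be sketched as follows: a binary-only learner (here a toy 1-D threshold classifier standing in for the CSP pipeline) is trained on every pair of classes, and a query is labeled by majority vote over all pairwise winners. All names and the toy data below are illustrative assumptions, not the cited paper's implementation.

```python
from itertools import combinations

def train_threshold(xs_a, xs_b):
    """Fit a 1-D midpoint threshold between two classes (toy binary learner)."""
    mean_a = sum(xs_a) / len(xs_a)
    mean_b = sum(xs_b) / len(xs_b)
    t = (mean_a + mean_b) / 2.0
    # Record which side of the threshold class 'a' occupies.
    a_below = mean_a < t
    return t, a_below

def ovo_predict(train, x):
    """train: dict class_label -> list of 1-D samples; majority vote over pairs."""
    votes = {}
    for a, b in combinations(sorted(train), 2):
        t, a_below = train_threshold(train[a], train[b])
        winner = a if ((x < t) == a_below) else b
        votes[winner] = votes.get(winner, 0) + 1
    return max(votes, key=lambda c: votes[c])

data = {0: [1.0, 1.2], 1: [3.0, 3.2], 2: [5.0, 5.1]}
print(ovo_predict(data, 3.1))  # -> 1 (sample lies in class 1's cluster)
```

With K classes this trains K(K-1)/2 binary problems, which is exactly why a two-class-only method such as CSP can still serve multiclass tasks.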
“…In this regard, an ensemble learning approach consisting of three classifiers, namely Support Vector Machine (SVM), Artificial Neural Network (ANN) and gradient-boosted decision trees (XGBoost), has been used, where an output score is produced according to an ensemble classifier rule based on the input scores of all three classifiers, as shown in Fig. 9. The proposed system employs two types of ensemble rules: dynamic classifier selection (DCS) [38] and weighted classifier fusion (WCF) [39]. DCS selects the single best classifier for each action at the train-test split, i.e., the one most likely to produce the correct classification label for an input sample at the validation split.…”
Section: System Architecture
confidence: 99%
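The DCS rule described above can be sketched as: for each query, score every base classifier by its accuracy on the query's nearest validation neighbors and let only the locally best one predict, rather than fusing all outputs. The three stand-in classifiers and the 1-D data are illustrative assumptions (not the cited SVM/ANN/XGBoost models).

```python
def knn_indices(val_x, x, k):
    """Indices of the k validation samples closest to x (1-D distance)."""
    return sorted(range(len(val_x)), key=lambda i: abs(val_x[i] - x))[:k]

def dcs_predict(classifiers, val_x, val_y, x, k=3):
    """Select the single classifier most accurate near x, then use it alone."""
    neigh = knn_indices(val_x, x, k)
    def local_acc(clf):
        return sum(clf(val_x[i]) == val_y[i] for i in neigh) / k
    best = max(classifiers, key=local_acc)
    return best(x)

# Toy stand-in classifiers with different decision boundaries.
clf_a = lambda x: 0 if x < 2.0 else 1
clf_b = lambda x: 0 if x < 4.0 else 1
clf_c = lambda x: 1                      # always predicts class 1

val_x = [1.0, 1.5, 3.0, 3.5, 5.0]
val_y = [0,   0,   0,   0,   1]
print(dcs_predict([clf_a, clf_b, clf_c], val_x, val_y, 3.2))  # -> 0 (clf_b wins locally)
```

Near x = 3.2 the three nearest validation labels are all 0, so `clf_b` (perfect there) is selected even though `clf_a` and `clf_c` would answer 1.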
“…These methods are based on the classification accuracy in the local neighborhood of the test sample, where the neighborhood is defined by the k-nearest-neighbors (KNN) algorithm [15] or by a clustering algorithm [16]. For example, the overall local accuracy (OLA) selects the optimal classifier based on the accuracy of the classifier in the local neighborhood [17]. Another method is the local class accuracy (LCA), which uses a posteriori information to calculate the performance of the base classifier for particular classes [18].…”
Section: Classification Accuracy Based On Local Neighborhood
confidence: 99%
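The two competence measures named in the quote can be contrasted in a short sketch under the same kNN neighborhood: OLA scores a classifier by its overall accuracy on the k nearest validation samples, while LCA (using a posteriori information) scores it only on those neighbors whose true label equals the class the classifier assigns to the query. The 1-D data and the toy classifier are illustrative assumptions.

```python
def neighbours(val_x, x, k):
    """Indices of the k validation samples closest to x (1-D distance)."""
    return sorted(range(len(val_x)), key=lambda i: abs(val_x[i] - x))[:k]

def ola(clf, val_x, val_y, x, k=3):
    """Overall local accuracy: plain accuracy over the k nearest neighbours."""
    idx = neighbours(val_x, x, k)
    return sum(clf(val_x[i]) == val_y[i] for i in idx) / k

def lca(clf, val_x, val_y, x, k=3):
    """Local class accuracy: accuracy restricted to neighbours whose true
    label matches the class the classifier assigns to the query."""
    idx = neighbours(val_x, x, k)
    target = clf(x)
    same = [i for i in idx if val_y[i] == target]
    if not same:
        return 0.0
    return sum(clf(val_x[i]) == target for i in same) / len(same)

clf = lambda x: 0 if x < 4.5 else 1      # imperfect toy classifier
val_x = [1.0, 2.0, 4.0, 5.0]
val_y = [0,   0,   1,   1]
print(round(ola(clf, val_x, val_y, 2.5), 2), lca(clf, val_x, val_y, 2.5))
```

Here the classifier misclassifies the neighbor at 4.0, so OLA is 2/3, while LCA is 1.0 because it only examines the neighbors of the predicted class (0), on which the classifier is perfect; the two measures can therefore rank base classifiers differently.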