2015
DOI: 10.1016/j.neucom.2014.09.021
Enhancement of multi-class support vector machine construction from binary learners using generalization performance

Cited by 10 publications (3 citation statements)
References 27 publications
“…As described in Algorithm 2, IB-DTree constructs a tree using a recursive procedure starting from the root node (lines 1-7), with all candidate classes initialized in line 2. The node-adding procedure is processed in lines 8-19. First, the initial classifier h with the lowest entropy will be selected.…”
Section: The Information-based Decision Tree
confidence: 99%
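The quoted passage describes selecting the node's classifier by lowest entropy. A minimal sketch of that selection step, assuming "entropy" means the class-weighted Shannon entropy of the partition each candidate binary classifier induces over the training examples; the function names and the example counts below are hypothetical, not from the paper:

```python
import math

def split_entropy(counts_left, counts_right):
    """Weighted entropy of a binary partition of the candidate classes.

    counts_left/counts_right: dict mapping class label -> number of
    examples routed to that side by the binary classifier.
    """
    def entropy(counts):
        total = sum(counts.values())
        if total == 0:
            return 0.0
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values() if c > 0)

    n_left = sum(counts_left.values())
    n_right = sum(counts_right.values())
    n = n_left + n_right
    return ((n_left / n) * entropy(counts_left)
            + (n_right / n) * entropy(counts_right))

def pick_root_classifier(candidate_splits):
    """Return the candidate split (classifier) with the lowest entropy,
    mirroring the 'initial classifier h with the lowest entropy' step."""
    return min(candidate_splits, key=lambda s: split_entropy(*s))

# Hypothetical example: two candidate binary classifiers over classes A-D.
splits = [
    ({'A': 10, 'B': 10}, {'C': 10, 'D': 10}),               # clean 2-vs-2 split
    ({'A': 10, 'B': 5, 'C': 5}, {'B': 5, 'C': 5, 'D': 10}), # mixed split
]
best = pick_root_classifier(splits)  # the cleaner split has lower entropy
```

The clean 2-vs-2 split has weighted entropy 1.0 bit against 1.5 bits for the mixed split, so the first candidate would be chosen for the root.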
“…Generalization error can be estimated directly using k-fold cross-validation and used to compare the performance of binary classifiers, but this incurs a high computational cost. Another method to estimate the generalization error is to use Inequation 2 with the appropriate parameter substitution [19]. Using the latter method, we can compare relative generalization errors (Algorithm 3: Information-Based and Generalization-Error Estimation Decision Tree SVM, IBGE-DTree).…”
confidence: 99%
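The first, direct estimator mentioned in the quote can be sketched as plain k-fold cross-validation; the bound-based alternative ("Inequation 2" with substituted parameters) is not reproduced here, since its parameters are not given in this excerpt. The function names and the toy threshold learner below are hypothetical illustrations, not the paper's method:

```python
import random

def kfold_error(X, y, train_fn, k=5, seed=0):
    """Estimate generalization error of a binary learner by k-fold CV.

    train_fn(X_train, y_train) must return a predict(x) callable.
    This is the direct-but-expensive estimator mentioned in the text:
    the learner is retrained k times, once per held-out fold.
    """
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for fold in folds:
        held_out = set(fold)
        X_tr = [X[i] for i in idx if i not in held_out]
        y_tr = [y[i] for i in idx if i not in held_out]
        predict = train_fn(X_tr, y_tr)
        wrong = sum(predict(X[i]) != y[i] for i in fold)
        errors.append(wrong / len(fold))
    return sum(errors) / k

# Hypothetical usage with a trivial threshold 'learner' on 1-D data:
# classify as 1 if x is at or above the training mean.
def train_threshold(X, y):
    thr = sum(X) / len(X)
    return lambda x: int(x >= thr)

X = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 0.15, 0.85]
y = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]
err = kfold_error(X, y, train_threshold, k=5)
```

The cost contrast in the quote is visible here: cross-validation trains k models per candidate classifier, whereas a bound-based estimate needs only quantities already computed during a single training run.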
“…A more recent work has utilised DAG structures to solve the multi-class classification problem (Songsiri, Phetkaew & Kijsirikul, 2015). More specifically, the DAG structure is used to combine the prediction results obtained from a set of binary classifiers, which can be considered a special case of using a set of binary classifiers to solve the multi-class classification problem.…”
Section: Literature Review
confidence: 99%