A multi-class boosting method with direct optimization
Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2014
DOI: 10.1145/2623330.2623689

Cited by 4 publications (1 citation statement)
References 26 publications
“…In the first one, named binary reduction, a multi-class problem is recast to a combination of a number of binary sub-problems, each to be solved with the original AdaBoost. Unfortunately, binary reduction is subject to a number of problems (Zhai et al., 2014). The second scheme plans to directly make use of native multi-class base classifiers, and use them in the context of a boosting procedure.…”
Section: Introduction
Confidence: 99%
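For context, the two schemes the quoted statement contrasts can be sketched in a few lines of scikit-learn. This is an illustrative analogue, not the method proposed in the cited paper; the iris dataset and estimator settings are assumptions made only for the example.

```python
# Sketch of the two multi-class boosting schemes described above (assumed
# setup, not the cited paper's algorithm).
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scheme 1: binary reduction -- the multi-class problem is recast as several
# binary sub-problems (here one-vs-rest), each solved by a binary AdaBoost.
binary_reduction = OneVsRestClassifier(AdaBoostClassifier(n_estimators=100))
binary_reduction.fit(X_train, y_train)

# Scheme 2: boosting that uses native multi-class base classifiers directly
# (scikit-learn's AdaBoostClassifier handles all classes in one ensemble).
direct_multiclass = AdaBoostClassifier(n_estimators=100)
direct_multiclass.fit(X_train, y_train)

print("binary reduction accuracy :", binary_reduction.score(X_test, y_test))
print("direct multi-class accuracy:", direct_multiclass.score(X_test, y_test))
```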