CVPR 2011
DOI: 10.1109/cvpr.2011.5995605

TaylorBoost: First and second-order boosting algorithms with explicit margin control

Cited by 27 publications (25 citation statements)
References 11 publications

“…With smaller trees, MCDB gives the best results on all datasets, indicating that MCDB requires only very weak base classifiers even though there is no exact weak-learner condition for MCDB. GD-MCBoost achieves the second-best accuracy; this algorithm also tolerates weak base classifiers, since it can boost any type of weak learner with a non-zero directional derivative [14]. We do not report its results on the Poker525k dataset, since one iteration takes more than 12 hours to run with the authors' Matlab code.…”
Section: Results (mentioning; confidence: 99%)
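
The "non-zero directional derivative" condition in the excerpt above can be made concrete. Below is a minimal Python sketch for the binary margin-based case (the exponential loss, the candidate pool, and all names are illustrative assumptions, not code from the cited papers): it evaluates the first-order directional derivative of the empirical risk along each candidate weak learner and keeps only candidates for which it is non-zero.

import numpy as np

def directional_derivative(y, f, g, loss_grad):
    # First-order variation of R(f) = mean(loss(y * f)) along direction g:
    # d/de R(f + e*g) at e = 0  =  mean(loss'(y*f) * y * g).
    return np.mean(loss_grad(y * f) * y * g)

# Exponential loss phi(v) = exp(-v), so phi'(v) = -exp(-v).
exp_loss_grad = lambda v: -np.exp(-v)

rng = np.random.default_rng(0)
y = rng.choice([-1.0, 1.0], size=100)              # binary labels
f = rng.normal(size=100)                           # current predictor outputs f_k(x_i)
pool = [rng.choice([-1.0, 1.0], size=100) for _ in range(5)]

derivs = np.array([directional_derivative(y, f, g, exp_loss_grad) for g in pool])
boostable = np.flatnonzero(np.abs(derivs) > 1e-12)  # non-zero derivative only
best = int(np.argmin(derivs))                       # steepest descent direction
print("derivatives:", np.round(derivs, 3), "boostable:", boostable, "pick:", best)

A candidate whose directional derivative is zero cannot decrease the risk to first order, which is why such learners cannot be boosted.
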
“…For comparison, we also report the results of AdaBoost.M1 [6], AdaBoost.MH [16], SAMME [30], and GD-MCBoost [14]. All these algorithms use multi-class base classifiers except AdaBoost.MH, which reduces the multi-class problem to a set of binary classification problems.…”
Section: Results (mentioning; confidence: 99%)
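
The reduction used by AdaBoost.MH, mentioned in the quote above, is easy to illustrate. The sketch below (the function name mh_expand and the label-as-feature encoding are assumptions for illustration; AdaBoost.MH itself learns a binary scorer f(x, l) over example-label pairs) expands a K-class training set into a single binary one.

import numpy as np

def mh_expand(X, y, num_classes):
    # AdaBoost.MH-style reduction (sketch): replicate each example once per
    # class label l, with binary target +1 if l is the true class, else -1.
    n = X.shape[0]
    X_aug = np.repeat(X, num_classes, axis=0)          # each row repeated K times
    labels = np.tile(np.arange(num_classes), n)        # candidate label l per row
    X_bin = np.column_stack([X_aug, labels])           # encode l as an extra feature
    y_bin = np.where(labels == np.repeat(y, num_classes), 1.0, -1.0)
    return X_bin, y_bin

# Tiny usage example: 4 samples, 2 features, 3 classes.
X = np.arange(8, dtype=float).reshape(4, 2)
y = np.array([0, 2, 1, 0])
X_bin, y_bin = mh_expand(X, y, num_classes=3)
print(X_bin.shape, y_bin.shape)  # (12, 3) (12,)

Prediction then picks the class label l with the largest binary score f(x, l).
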
“…Using a Taylor series expansion of R(f_k + g) around f_k, the first- and second-order functional variations along the direction of weak learner g(x) ∈ G are [18] δR(f…”
Section: Boosting (mentioning; confidence: 99%)
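
Since the quote is truncated, it may help to write out what such a functional Taylor expansion typically looks like. The following LaTeX sketch uses generic notation and may differ in detail from [18]:

% Second-order functional Taylor expansion of the risk R around the current
% predictor f_k, along the direction of a candidate weak learner g.
R(f_k + g) \approx R(f_k) + \delta R(f_k)[g] + \tfrac{1}{2}\,\delta^2 R(f_k)[g],
\quad
\delta R(f_k)[g] = \frac{\partial}{\partial \epsilon} R(f_k + \epsilon g)\Big|_{\epsilon=0},
\qquad
\delta^2 R(f_k)[g] = \frac{\partial^2}{\partial \epsilon^2} R(f_k + \epsilon g)\Big|_{\epsilon=0}.

Selecting g by the first-order term alone recovers gradient-descent boosting; using both terms gives a Newton-style update.
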
“…The proposed boosting algorithm is based on the TaylorBoost framework [18]. Assume that after k iterations the predictor has m terms,…”
Section: Boosting Sum of Products (SOP-Boost) (mentioning; confidence: 99%)
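
For context on how a TaylorBoost-style iteration proceeds, here is a first-order Python sketch (the exponential loss, a finite candidate pool standing in for the weak-learner family G, and a grid line search are all simplifying assumptions; this is an illustration, not the authors' implementation):

import numpy as np

def taylor_boost_step(y, f, candidates, step_grid=np.linspace(0.0, 2.0, 41)):
    # One first-order boosting iteration: pick the weak learner g with the
    # steepest (most negative) first-order variation of R(f) = mean(exp(-y*f)),
    # then line-search a step size alpha along it.
    w = np.exp(-y * f)                                  # example weights, -phi'(y f)
    scores = [-np.mean(w * y * g) for g in candidates]  # first-order variation per g
    g = candidates[int(np.argmin(scores))]
    risks = [np.mean(np.exp(-y * (f + a * g))) for a in step_grid]
    alpha = step_grid[int(np.argmin(risks))]
    return f + alpha * g, alpha

rng = np.random.default_rng(1)
y = rng.choice([-1.0, 1.0], size=200)
f = np.zeros(200)                                       # f_0 = 0
pool = [np.sign(rng.normal(size=200)) for _ in range(20)]
for k in range(5):
    f, alpha = taylor_boost_step(y, f, pool)
print("training risk:", float(np.mean(np.exp(-y * f))))

Each iteration adds one term alpha * g to the predictor, so after k iterations f is a weighted sum of k weak learners, matching the quote's "after k iterations the predictor has m terms" setup.
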