2007
DOI: 10.1214/009053607000000785

Analysis of boosting algorithms using the smooth margin function

Abstract: We introduce a useful tool for analyzing boosting algorithms called the "smooth margin function," a differentiable approximation of the usual margin for boosting algorithms. We present two boosting algorithms based on this smooth margin, "coordinate ascent boosting" and "approximate coordinate ascent boosting," which are similar to Freund and Schapire's AdaBoost algorithm and Breiman's arc-gv algorithm. We give convergence rates to the maximum margin solution for both of our algorithms and for arc-gv. We…
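
As a rough illustration of the quantity the abstract refers to: in this line of work the smooth margin of a combined classifier with weak-classifier weights λ is defined as G(λ) = −ln F(λ) / ||λ||_1, where F(λ) = Σ_i exp(−(Mλ)_i) is AdaBoost's exponential loss and M is the margin matrix with M_ij = y_i h_j(x_i). The Python sketch below is not from the paper; the function names and toy data are illustrative, and the definition of G is assumed from the surrounding literature.

import numpy as np

def smooth_margin(M, lam):
    # G(lam) = -ln( sum_i exp(-(M lam)_i) ) / sum_j lam_j
    # M[i, j] = y_i * h_j(x_i): +1 if weak classifier j is correct on example i, -1 otherwise.
    scores = M @ lam                      # unnormalized per-example margins
    loss = np.sum(np.exp(-scores))        # AdaBoost's exponential loss F(lam)
    return -np.log(loss) / np.sum(lam)

def hard_margin(M, lam):
    # Usual minimum normalized margin: min_i (M lam)_i / ||lam||_1.
    return np.min(M @ lam) / np.sum(lam)

# Toy, separable margin matrix: 4 examples, 3 weak classifiers (illustrative only).
M = np.array([[ 1,  1, -1],
              [ 1, -1,  1],
              [-1,  1,  1],
              [ 1,  1,  1]])
lam = np.array([2.0, 1.5, 1.0])
print(smooth_margin(M, lam))  # ~0.018
print(hard_margin(M, lam))    # ~0.111; the smooth margin never exceeds the hard margin

Since Σ_i exp(−(Mλ)_i) lies between exp(−min_i (Mλ)_i) and m·exp(−min_i (Mλ)_i), the smooth margin satisfies µ(λ) − ln(m)/||λ||_1 ≤ G(λ) ≤ µ(λ), so it approaches the usual margin µ(λ) as ||λ||_1 grows.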

Cited by 18 publications (17 citation statements)
References 20 publications

“…In this case, the algorithm shares the simplicity of the popular AdaBoost approach. The rate of convergence we obtain matches the rate of the AdaBoost⋆ algorithm described by Rätsch and Warmuth [RW05] and is better than the rate obtained in Rudin et al. [RSD07]. We note also that if A is γ-separable and we set ε = γ/2, then we would find a solution with half the optimal margin in O(log(m)/γ²) iterations.…”
Section: Theorem 9 (Assume That the Algorithm Given In…); supporting
confidence: 78%
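
As a concrete reading of the quoted bound (a back-of-the-envelope instance, ignoring the constant hidden by the O-notation; the numbers are illustrative and not taken from either cited paper):

\[
m = 10^{4},\quad \gamma = 0.1:\qquad
\frac{\log m}{\gamma^{2}} \;=\; \frac{\ln 10^{4}}{0.01} \;\approx\; 9.2\times 10^{2}
\]

iterations suffice, in order of magnitude, to reach a margin of at least γ/2 = 0.05 on the training set.
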
“…Therefore, our algorithm attains the same rate of convergence as AdaBoost, while both algorithms obtain a margin which is half of the optimal margin. (See also the margin analysis of AdaBoost described in Rudin et al. [RSD07].)…”
Section: Theorem 9 (Assume That the Algorithm Given In…); mentioning
confidence: 99%
“…Note that α_t^arc is non-negative since µ_t^arc ≤ ρ ≤ r_t^arc. We start our calculation from when the smooth margin is positive; if the data is separable, one can always use AdaBoost until the smooth margin is positive (see [15]). We denote by 1 the first iteration where G is positive, so g_1^arc > 0.…”
Section: A Convergence Rate for Arc-gv; mentioning
confidence: 99%
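
The remark that one can run AdaBoost until the smooth margin becomes positive has a short justification, assuming (as above) the definition G(λ) = −ln F(λ)/||λ||_1 with F the exponential loss; this is a reconstruction of the argument, not a quotation from [15]:

\[
G(\lambda) > 0 \;\Longleftrightarrow\; F(\lambda) = \sum_{i=1}^{m} e^{-(M\lambda)_i} < 1,
\]

and on separable data AdaBoost decreases F geometrically (roughly F(λ_T) ≤ m·e^{−2γ²T} when every weak classifier has edge at least γ), so F drops below 1, and hence G turns positive, after finitely many rounds.
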
“…The smooth margin obeys a useful recursion relation which helps greatly in the analyses for both algorithms. For more detailed analysis of the smooth margin function see [14], and for detailed proofs, see the extended version of this work [15].…”
Section: Introduction; mentioning
confidence: 99%