2003
DOI: 10.1214/aoap/1042765667
Bounding the generalization error of convex combinations of classifiers: balancing the dimensionality and the margins

Abstract: A problem of bounding the generalization error of a classifier f ∈ conv(H), where H is a "base" class of functions (classifiers), is considered. This problem frequently occurs in machine learning, where efficient algorithms for combining simple classifiers into a complex one (such as boosting and bagging) have attracted a lot of attention. Using Talagrand's concentration inequalities for empirical processes, we obtain new sharper bounds on the generalization error of combined classifiers that take into account…
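For orientation (a standard reconstruction of the setting, not taken from the truncated abstract): the classifiers in question are convex combinations of base classifiers, and the bounds compare the generalization error P{y f(x) ≤ 0} with the empirical margin distribution P_n{y f(x) ≤ δ} at a margin level δ > 0.

% A convex combination of base classifiers h_j from the class H:
\[
  f \;=\; \sum_{j=1}^{N} \lambda_j h_j \;\in\; \operatorname{conv}(\mathcal{H}),
  \qquad \lambda_j \ge 0, \quad \sum_{j=1}^{N} \lambda_j = 1.
\]
% For a labeled example (x, y) with y in {-1, +1}, the margin of f on (x, y)
% is y f(x); misclassification corresponds to y f(x) <= 0.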

Cited by 17 publications (30 citation statements, all classified as mentioning); citing publications span 2003–2013. References 24 publications.
“…The second term of the bound is of order (√n δ)^{-1}, and will also be small for large n, which makes the bound meaningful. This result was extended by Schapire and Singer in [29] to classes of real-valued functions, namely to so-called VC-subgraph classes (for the definition see [32]), and was further extended in several directions in [19] and [21]. The main idea of this follow-up work was to replace the second term of the bound proved by Schapire et al. [28] by a function ε_n(F; δ; t) that has better dependence on the sample size n and on the margin parameter δ.…”
mentioning
confidence: 99%
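In schematic form (a sketch assuming only the generic shape of these bounds; C is an absolute constant, log factors are omitted, and the exact ε_n differs between [19] and [21]), the replacement described in this statement is:

% Both right-hand sides upper-bound P{ y f(x) <= 0 } with probability
% at least 1 - t, uniformly in the margin level delta > 0; the follow-up
% work swaps the second term for one with better dependence on n and delta.
\[
  P_n\{\, y f(x) \le \delta \,\} \;+\; \frac{C}{\sqrt{n}\,\delta}
  \quad\longrightarrow\quad
  P_n\{\, y f(x) \le \delta \,\} \;+\; \varepsilon_n(\mathcal{F};\, \delta;\, t).
\]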
“…In [21], Koltchinskii, Panchenko and Lozano proved bounds on the generalization error under a more general assumption on the entropy of the class F:…”
mentioning
confidence: 99%
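The quoted statement is truncated at the colon, so the exact assumption is not recoverable from this page. Purely for illustration, entropy assumptions of this kind are typically phrased via covering numbers, for example a condition of the shape:

% Illustrative only -- not a quotation from [21]. N(F; L_2(P_n); u) is the
% minimal number of radius-u balls in the empirical L_2(P_n) metric needed
% to cover F; D > 0 and 0 < alpha < 2 are fixed parameters. Conditions of
% this type generalize the VC-subgraph case.
\[
  \log N\bigl(\mathcal{F};\, L_2(P_n);\, u\bigr) \;\le\; D\, u^{-\alpha},
  \qquad 0 < u \le 1.
\]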