2015
DOI: 10.1155/2015/835357

Analysis of Generalization Ability for Different AdaBoost Variants Based on Classification and Regression Trees

Abstract: As a machine learning method, AdaBoost is widely applied to data classification and object detection because of its robustness and efficiency. AdaBoost constructs a global and optimal combination of weak classifiers based on sample reweighting, and this kind of combination is known to improve classification performance tremendously. As the popularity of AdaBoost has grown, many variants have been proposed to improve its performance. Consequently, a lot of comparison and review studies for AdaBoost …
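The sample-reweighting scheme the abstract describes can be sketched in a few lines. Below is a minimal, illustrative Discrete AdaBoost over depth-1 CART stumps (matching the paper's classification-and-regression-tree setting); all names and parameters are assumptions of this sketch, not taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=20):
    """Discrete AdaBoost with depth-1 CART stumps; labels y must be in {-1, +1}."""
    w = np.full(len(y), 1.0 / len(y))         # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w[pred != y])            # weighted training error
        if err >= 0.5:                        # weak learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        w *= np.exp(-alpha * y * pred)        # misclassified samples gain weight
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # Final classifier: sign of the alpha-weighted vote of all weak learners.
    return np.sign(sum(a * s.predict(X) for a, s in zip(alphas, stumps)))
```

The reweighting is the whole mechanism: each round's stump is fit to the weights left behind by its predecessors, so later weak learners concentrate on the samples the ensemble still gets wrong.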


Cited by 24 publications (13 citation statements)
References: 37 publications
“…As mentioned in Section 1, we use an ensemble of classifiers to label an RoI as an AD or a normal site. There are two popular ensemble classifiers, AdaBoost [8,20,21] and Random Forest [9], both of which build a classifier with multiple weak learners. The reader may note that weak learners in AdaBoost are trained on feature vectors that are assigned unequal weights based on their difficulty level in classification.…”
Section: Fig. 2, Overview of Our Ensemble-Based Approach
confidence: 99%
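Both ensembles the excerpt contrasts are available off the shelf. A minimal scikit-learn sketch on synthetic stand-in data (the dataset and parameters are illustrative assumptions, not from the cited work) shows the usage difference:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for RoI feature vectors (labels: AD vs. normal site).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# AdaBoost: weak learners see reweighted samples (hard cases gain weight).
ada = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
# Random Forest: weak learners see bootstrap samples with equal weights.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

print("AdaBoost accuracy:     ", ada.score(X_te, y_te))
print("Random Forest accuracy:", rf.score(X_te, y_te))
```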
“…One approach to AD detection that resembles consulting multiple radiologists is to weigh the decisions of multiple classifiers (in place of radiologists) when coming to a final decision regarding a particular RoI. Results on different pattern recognition problems show that an ensemble performs better [8] than a single classifier in certain problems.…”
Section: Introduction
confidence: 99%
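The radiologist analogy amounts to a weighted majority vote. A small sketch, assuming {-1, +1} decisions and illustrative trust weights:

```python
import numpy as np

def weighted_vote(predictions, weights):
    """Combine {-1, +1} decisions from several classifiers ("radiologists")
    into one decision per RoI via a weighted majority vote."""
    predictions = np.asarray(predictions, dtype=float)  # shape: (n_classifiers, n_rois)
    weights = np.asarray(weights, dtype=float)
    return np.sign(weights @ predictions)               # sign of the weighted sum

# Three classifiers judging four RoIs; the first classifier is trusted most.
votes = [[+1, -1, +1, +1],
         [+1, +1, -1, +1],
         [-1, -1, -1, +1]]
print(weighted_vote(votes, weights=[0.6, 0.3, 0.1]))    # -> [ 1. -1.  1.  1.]
```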
“…In the literature, AdaBoost has been used for detecting objects [26], detecting faults in gearboxes [27], and detecting hands [28]. In another paper, AdaBoost is used in mobile intelligent terminals for security-level classification [29]. Various modified AdaBoost algorithms were analyzed for classification margin [30]; further, in the same paper, the generalization error for eight variants of AdaBoost was reported, and Personalized AdaBoost, one of the variants, had the least generalization error…”
Section: AdaBoost Classifier
confidence: 99%
“…α is a weighting factor for each weak classifier g, and S is the decision threshold of the overall classification function G. Different variants of the boosting algorithm have been developed: Discrete AdaBoost [8], Real AdaBoost [18], Gentle AdaBoost, etc.; a recent performance analysis of these alternatives is presented in [19]. We use here the first of these, defined by the following pseudo-algorithm: for each descriptor (j), train a classifier (G_j).…”
Section: Classifiermentioning
confidence: 99%
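The excerpt's pseudo-algorithm (train one threshold classifier per descriptor, keep the best, weight it by α, and threshold the weighted sum at S) can be sketched as follows; the helper names and data layout are assumptions for illustration:

```python
import numpy as np

def best_stump(X, y, w):
    # For each descriptor j, try every threshold/polarity and keep the
    # single-feature classifier G_j with the lowest weighted error.
    best = (np.inf, None)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (+1, -1):
                pred = pol * np.where(X[:, j] >= thr, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, (j, thr, pol))
    return best

def train(X, y, rounds=10):
    # Discrete AdaBoost; labels y must be in {-1, +1}.
    w = np.full(len(y), 1.0 / len(y))
    model = []
    for _ in range(rounds):
        err, (j, thr, pol) = best_stump(X, y, w)
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))  # weight of weak classifier g
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # boost the weights of the hard samples
        w /= w.sum()
        model.append((alpha, j, thr, pol))
    return model

def classify(model, X, S=0.0):
    # Overall decision: alpha-weighted vote of the weak classifiers,
    # thresholded at S as in the excerpt's classification function G.
    score = sum(a * p * np.where(X[:, j] >= t, 1, -1) for a, j, t, p in model)
    return np.where(score >= S, 1, -1)
```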