2019
DOI: 10.1155/2019/8906034

Two‐Stage Bagging Pruning for Reducing the Ensemble Size and Improving the Classification Performance

Abstract: Ensemble methods, such as the traditional bagging algorithm, can usually improve the performance of a single classifier. However, they typically require large storage space and make relatively time-consuming predictions. Many approaches have been developed to reduce the ensemble size and improve classification performance by pruning the traditional bagging algorithm. In this article, we propose a two-stage strategy to prune the traditional bagging algorithm by combining two simple approaches: accuracy-based p…
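The abstract is truncated, but the first pruning stage it names is "accuracy-based". A minimal sketch of what such a stage could look like is below; the base learner (`ThresholdStump`), the helper names, and the keep-the-top-k cutoff are illustrative assumptions on my part, not the paper's actual procedure.

```python
import random

class ThresholdStump:
    """Minimal 1-D base classifier: predicts 1 when x >= threshold."""
    def fit(self, data):
        ones = [x for x, y in data if y == 1]
        zeros = [x for x, y in data if y == 0]
        mean1 = sum(ones) / len(ones) if ones else 0.0
        mean0 = sum(zeros) / len(zeros) if zeros else 0.0
        self.threshold = (mean1 + mean0) / 2  # midpoint between class means
        return self

    def predict(self, x):
        return 1 if x >= self.threshold else 0

def bootstrap_sample(data, rng):
    # Draw len(data) examples with replacement (the bagging step).
    return [rng.choice(data) for _ in data]

def accuracy(model, data):
    return sum(model.predict(x) == y for x, y in data) / len(data)

def accuracy_prune(models, val_data, keep):
    # Rank ensemble members by individual validation accuracy and
    # retain only the `keep` best (one plausible "accuracy-based" stage).
    ranked = sorted(models, key=lambda m: accuracy(m, val_data), reverse=True)
    return ranked[:keep]

def vote(models, x):
    # Majority vote over the (pruned) ensemble.
    votes = sum(m.predict(x) for m in models)
    return 1 if votes * 2 >= len(models) else 0
```

For example, training 10 stumps on bootstrap samples of a toy 1-D dataset and pruning to 3 members keeps the individually most accurate stumps while shrinking the ensemble, which is the storage/prediction-cost trade-off the abstract describes.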

Cited by 15 publications (7 citation statements)
References 35 publications
“…Bagging, or the bootstrap aggregation method, also uses ensemble learning to build ML models [42]. This algorithm is based on the hypothesis that combining multiple models can often produce a much more powerful model [43]. Logistic regression (Lreg) is one of the simplest and most common ML algorithms and has often been used on low-dimensional data for binary classification problems.…”
Section: Methods
confidence: 99%
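The hypothesis this excerpt states — that combining multiple models produces a more powerful one — can be illustrated with a toy variance-reduction experiment. The function names and the Gaussian-noise model below are my own illustrative assumptions, not from the cited works: averaging n independent noisy predictors shrinks the prediction variance by roughly a factor of n.

```python
import random

def single_prediction(target, rng, noise=1.0):
    # One "model": the true target plus independent Gaussian noise.
    return target + rng.gauss(0.0, noise)

def bagged_prediction(target, n_models, rng, noise=1.0):
    # Averaging n independent noisy models shrinks variance ~ noise**2 / n.
    return sum(single_prediction(target, rng, noise) for _ in range(n_models)) / n_models

def variance(xs):
    # Plain population variance of a sample.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)
```

Empirically, an ensemble of 25 such models has roughly 1/25 the variance of a single model, which is the intuition behind bagging's improvement over one classifier.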
“…Although RF subtrees are grown from the training dataset through bagging, the average over all single trees is trained via parallel processing, which focuses only on a combination of strong prediction models [51], leading to data bias. Although bagging is widely used in machine learning models to mitigate overfitting and reduce the variance of the data [52], it suffers from limitations when the data quantity is small or when the data distribution and data quality are poor.…”
Section: Min_samples_leaf
confidence: 99%
“…The classical bagging approach comprises two major modules: bootstrap and aggregation. First, a number of subsets are sampled randomly, with replacement, from the original training set using the bootstrap sampling principle [14]. Then, the bagging method aggregates the outputs of the base models by majority voting for the classification process.…”
Section: BC Model
confidence: 99%
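The two modules this excerpt describes can be sketched in a few lines; the function names are my own, but the logic follows the excerpt: bootstrap sampling draws with replacement, and aggregation takes the most frequent predicted class.

```python
import random
from collections import Counter

def draw_bootstrap_subset(dataset, rng):
    # Bootstrap module: draw len(dataset) examples with replacement,
    # so some examples repeat and others are left out of the subset.
    return [rng.choice(dataset) for _ in dataset]

def majority_vote(labels):
    # Aggregation module: the most frequent predicted class label wins.
    return Counter(labels).most_common(1)[0][0]
```

Each base model is trained on its own bootstrap subset, and at prediction time their outputs are collected and passed through `majority_vote`.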