2018
DOI: 10.1109/tcyb.2016.2623900

A Novel AdaBoost Framework With Robust Threshold and Structural Optimization

Abstract: The AdaBoost algorithm is a popular ensemble method that combines several weak learners to boost generalization performance. However, conventional AdaBoost.RT algorithms suffer from the limitation that the threshold value must be specified manually rather than chosen through a self-adaptive mechanism, which cannot guarantee an optimal model in general cases. In this paper, we present a generic AdaBoost framework with a robust threshold mechanism and structural optimization for regression problems. Th…
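
The full text sits behind the DOI above, so as a rough sketch of the setting the abstract describes, the following Python loop implements a conventional AdaBoost.RT-style regression ensemble in which the threshold phi is a fixed, manually chosen hyperparameter — exactly the quantity the paper proposes to set self-adaptively. The base learner, the relative-error definition, and all names are illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def adaboost_rt(X, y, phi=0.1, n_rounds=20, power=2):
    """AdaBoost.RT-style regression ensemble (sketch, not the paper's method).

    `phi` is the relative-error threshold that conventional AdaBoost.RT
    takes as a fixed hyperparameter; choosing it self-adaptively is the
    gap the abstract points at. Assumes at least one round succeeds.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeRegressor(max_depth=3)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        # absolute relative error of each training sample
        are = np.abs(pred - y) / np.maximum(np.abs(y), 1e-12)
        err = w[are > phi].sum()                 # weighted error rate
        if err >= 0.5:                           # learner too weak: stop
            break
        beta = max(err, 1e-12) ** power
        learners.append(stump)
        alphas.append(np.log(1.0 / beta))
        if err == 0.0:                           # every sample within phi
            break
        w = np.where(are <= phi, w * beta, w)    # shrink "correct" weights
        w /= w.sum()

    def predict(Xq):
        preds = np.stack([m.predict(Xq) for m in learners])
        a = np.asarray(alphas)[:, None]
        return (a * preds).sum(axis=0) / a.sum() # log(1/beta)-weighted mean
    return predict
```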

Cited by 65 publications (21 citation statements). References 34 publications.

“…Although ELM is a relatively recent intelligent technique, its theories and related applications have been investigated extensively, such as kernel ELM, incremental ELM (I-ELM), Bayesian ELM, AdaBoost ELM [30], multi-layer ELM-LRF [31], hybrid ELMs [32], etc. In particular, the ELM auto-encoder (ELM-AE) [33] was proposed to introduce unsupervised learning into ELM by projecting the input data into a different dimensional space.…”
Section: B. Extreme Learning Machine (ELM), mentioning
confidence: 99%
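
For context on the ELM variants this quote surveys: a basic ELM draws the hidden-layer weights at random and learns only the output weights by regularized least squares. A minimal sketch, where the hidden size, tanh activation, and ridge term are illustrative assumptions:

```python
import numpy as np

class ELM:
    """Basic extreme learning machine for regression (sketch).
    Hidden weights W, b are random and never trained; only the output
    weights `beta` are fit, via regularized least squares."""
    def __init__(self, n_hidden=100, C=1.0, seed=0):
        self.n_hidden, self.C = n_hidden, C
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)      # random feature map H

    def fit(self, X, y):
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # beta = (H^T H + I/C)^{-1} H^T y, the regularized ELM solution
        A = H.T @ H + np.eye(self.n_hidden) / self.C
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

An ELM-AE, as the quote notes, reuses the same machinery with the input as its own target, so the learned output weights define a projection of the data into a different dimensional space.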
“…Although ELM is a relatively recent intelligent technique, the theories and related applications have been investigated extensively, such as kernel ELM, incremental ELM (I-ELM), Bayesian ELM, adaboost ELM [30], multi-layer ELM-LRF [31], hybrid ELMs [32], etc. Particularly, ELM auto-encoder (ELM-AE) [33] is proposed to introduce unsupervised learning for ELM, by projecting the input data into a different dimensional space.…”
Section: B Extreme Learning Machine (Elm)mentioning
confidence: 99%
“…This shows that the trained SCN is less robust in this case. Combining models through ensemble learning is a common remedy for low robustness [26,27]. Among ensemble learning methods, Boosting and Bagging are two representative families.…”
Section: Related Work, mentioning
confidence: 99%
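
A minimal sketch of the Bagging side of that remark, assuming a tree base learner; it is generic and not tied to the SCN models the quote discusses:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagging(X, y, n_models=25, seed=0):
    """Bagging: fit each base model on a bootstrap resample and average
    the predictions. Averaging damps the variance of an unstable base
    learner, which is the robustness argument the quote appeals to."""
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)   # n indices drawn with replacement
        models.append(DecisionTreeRegressor().fit(X[idx], y[idx]))
    return lambda Xq: np.mean([m.predict(Xq) for m in models], axis=0)
```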
“…By directly minimizing the computational cost of the cascade, it searched for the best partition point of each stage and provided theoretical support ensuring the existence of a unique optimal solution. P. Zhang [25] introduced a single-layer neural network to optimize the threshold based on the adaptive ELM with an S-shaped activation function, and also employed a grid-search strategy to select the regularization parameter C from a wide range. W M [26] used the AdaBoost method to find the samples assigned larger weights, removed them as possible outliers, and then retrained and redesigned the classifier model.…”
Section: Introduction, mentioning
confidence: 99%
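
The grid-search strategy mentioned for the regularization parameter C can be sketched generically as below; the scoring model (kernel ridge as a stand-in for the adaptive ELM of [25], with `alpha` playing the role of 1/C) and the log-spaced search range are assumptions:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

def pick_C(X, y, grid=None):
    """Grid search for a regularization parameter C over a wide
    log-spaced range, scored by 5-fold cross-validation.
    KernelRidge is a stand-in model, not the paper's ELM."""
    grid = 2.0 ** np.arange(-10, 11) if grid is None else grid
    scores = [cross_val_score(KernelRidge(alpha=1.0 / C), X, y, cv=5).mean()
              for C in grid]
    return grid[int(np.argmax(scores))]
```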