2017
DOI: 10.1155/2017/3405463

Improving Classification Performance through an Advanced Ensemble Based Heterogeneous Extreme Learning Machines

Abstract: Extreme Learning Machine (ELM) is a fast-learning algorithm for single-hidden-layer feedforward neural networks (SLFNs). It often achieves good generalization performance; however, it may overfit the training data when it uses more hidden nodes than needed. To improve generalization performance, we adopt a heterogeneous ensemble approach. We propose an Advanced ELM Ensemble (AELME) for classification, which includes Regularized-ELM, L2-norm-optimized ELM (ELML2), and Kernel-ELM. The ense…
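The fast-learning property the abstract refers to comes from ELM's closed-form training: the hidden-layer weights are drawn at random, and only the output weights are solved analytically via a pseudoinverse. A minimal NumPy sketch of a single ELM (illustrative only, not the paper's implementation; all names and the toy data are ours):

```python
import numpy as np

def elm_train(X, Y, n_hidden=20, seed=0):
    """Train a single-hidden-layer ELM: random hidden weights, analytic output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                 # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Predict class labels by taking the argmax over output-layer scores."""
    return (np.tanh(X @ W + b) @ beta).argmax(axis=1)

# Toy two-class problem: points left vs. right of the y-axis.
X = np.array([[-2.0, 0.1], [-1.5, -0.2], [1.8, 0.0], [2.2, 0.3]])
y = np.array([0, 0, 1, 1])
Y = np.eye(2)[y]                                 # one-hot targets
W, b, beta = elm_train(X, Y)
pred = elm_predict(X, W, b, beta)
```

Because the output weights are the exact least-squares solution, there is no iterative tuning of the hidden layer at all, which is also why over-provisioning hidden nodes can overfit.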

Cited by 42 publications (33 citation statements) | References 29 publications
“…An AdaBoost ensemble of CTrees can be defined as a combination of multiple CTrees in which the final prediction result is obtained by combining the outputs of the individual trees. Based on previous works [52][53][54][55][56], ensemble models have demonstrated better performance than individual models in a wide range of applications. The AdaBoost algorithm is demonstrated in Figure 6.…”
Section: Adaptive Boosting Classification (mentioning)
confidence: 99%
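The boosting-over-trees scheme described in this excerpt can be sketched with scikit-learn's `AdaBoostClassifier`, whose default base learner is a depth-1 classification tree (a decision stump); the dataset here is synthetic and purely illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification task (illustrative stand-in for real data).
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Each boosting round re-weights the samples the previous trees misclassified;
# the final prediction is a weighted vote over all trees.
ada = AdaBoostClassifier(n_estimators=50, random_state=0)
ada.fit(X_tr, y_tr)
score = ada.score(X_te, y_te)
```

The re-weighting is what distinguishes boosting from a plain averaging ensemble: later trees are forced to focus on the hard examples.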
“…An ensemble method is a machine learning technique that combines several classifiers to improve the overall predictive performance (28,29). The present section on model building and evaluation deals with a heterogeneous ensemble called the Voting-Boosting (VB) ensemble, whose architecture is shown in Figure 2.…”
Section: Model Building and Evaluation (mentioning)
confidence: 99%
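The Voting-Boosting architecture itself is specific to the cited paper, but the general idea of a heterogeneous voting ensemble — structurally different base learners combined by majority vote — can be sketched with scikit-learn (the base learners here are our own illustrative picks, not the VB ensemble's):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification task (illustrative stand-in for real data).
X, y = make_classification(n_samples=300, n_features=10, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Heterogeneous ensemble: three structurally different learners,
# combined by a hard (majority) vote over their class predictions.
vote = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("tree", DecisionTreeClassifier(max_depth=3, random_state=1)),
    ("ada", AdaBoostClassifier(n_estimators=25, random_state=1)),
], voting="hard")
vote.fit(X_tr, y_tr)
score = vote.score(X_te, y_te)
```

Heterogeneity matters because differently biased learners tend to make different errors, which the vote can cancel out.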
“…This approach has been shown to be effective for both regression and classification problems, although it may require more nodes than networks trained with gradient-based methods like back-propagation [24], due to the randomization of the hidden-layer weights and thresholds [72,69]. Furthermore, the ELM algorithm is unable to create SLNNs containing different combinations of neuron types in the hidden layer. (The term "different combinations of neuron types" refers to neuron types with structural units, like the structured composite model (C-Model) proposed by Christou et al. [9], which is utilized in the proposed algorithm. This model divides the neuron into three subunits (subcomponents), named the dendrite (D), the activation (S_a) function, and the activation-output (S_ao) function, and is able to create a custom neuron by varying any of the subunits from each category.)…”
Section: Introduction (mentioning)
confidence: 99%
“…The synaptic kernel inverse method (SKIM) by Tapson et al. [59] for event-based systems redefines the hidden neurons as synaptic kernels in which the input event-based signals are transformed into continuous-valued signals. The advanced ELM ensemble (AELME) from Abuassba et al. [1] constructs a network ensemble by training a randomly chosen ELM classifier on a subset of training data selected through random resampling. These methodologies work only with original units and do not take higher-order units into consideration.…”
Section: Introduction (mentioning)
confidence: 99%
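The random-resampling idea attributed to AELME above can be sketched in NumPy: each ELM member is trained on a bootstrap resample of the training set, and the members' predictions are combined by majority vote. This is our own illustrative reconstruction under those assumptions, not the authors' code (in particular, it uses identical plain ELM members rather than AELME's heterogeneous mix):

```python
import numpy as np

def train_elm(X, Y, n_hidden, rng):
    """One ELM member: random hidden layer, analytic output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    beta = np.linalg.pinv(np.tanh(X @ W + b)) @ Y
    return W, b, beta

def elm_ensemble(X, y, n_members=5, n_hidden=20, seed=0):
    """Train each member on a bootstrap resample of the training set."""
    rng = np.random.default_rng(seed)
    Y = np.eye(y.max() + 1)[y]                      # one-hot targets
    members = []
    for _ in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))  # resample with replacement
        members.append(train_elm(X[idx], Y[idx], n_hidden, rng))
    return members

def ensemble_predict(X, members, n_classes):
    """Majority vote over the members' per-sample class predictions."""
    votes = np.zeros((len(X), n_classes))
    for W, b, beta in members:
        pred = (np.tanh(X @ W + b) @ beta).argmax(axis=1)
        votes[np.arange(len(X)), pred] += 1
    return votes.argmax(axis=1)

# Toy data: two well-separated Gaussian clusters.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(-2, 0.5, size=(20, 2)),
               rng.normal(2, 0.5, size=(20, 2))])
y = np.array([0] * 20 + [1] * 20)
members = elm_ensemble(X, y)
acc = (ensemble_predict(X, members, n_classes=2) == y).mean()
```

Resampling gives each member a slightly different training view, so the vote averages out the variance introduced by each member's random hidden layer.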