2006
DOI: 10.1016/j.neucom.2005.12.014
Evolving hybrid ensembles of learning machines for better generalisation

Cited by 132 publications (52 citation statements)
References 21 publications
“…This includes the inference procedures and excellent predictive capabilities deployed by the techniques. A single overall technique that comes out of combining two or more existing techniques is called a hybrid system (Chandra and Yao 2006; Khashei et al. 2011). It is an approach that combines different theoretical backgrounds and algorithms, such as data mining and soft computing methodologies.…”
Section: Hybrid Intelligent Systems
confidence: 99%
“…Nevertheless, ensemble learning methods that directly encourage diversity by considering the interaction among all ensemble members during learning, such as NCL, have been shown to outperform other ensemble learning approaches in off-line mode [10], [11], [21]. So, ensemble learning methods in which the members are not trained independently have a potential advantage over methods that perform independent training in on-line mode.…”
Section: On-line Ensemble Learning
confidence: 99%
“…In this method, the training of an ensemble member is influenced by the others, directly encouraging diversity through a penalty correlation term in the error function of the base model's learning algorithm. In off-line mode, ensembles that directly encourage diversity by considering the interaction among all ensemble members during learning [10], [11], [21] have been shown to outperform other ensemble learning methods such as bagging and boosting. So, this is a potentially powerful advantage of NCL over the other on-line ensemble methods in the literature.…”
Section: On-line Ensemble Learning
confidence: 99%
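The penalty correlation term described in the statement above is the core of negative correlation learning. Below is a minimal NumPy sketch of the per-member NCL error for a single training example, assuming the common formulation e_i = ½(f_i − y)² + λ·p_i with p_i = (f_i − f̄)·Σ_{j≠i}(f_j − f̄), where f̄ is the simple-average ensemble output; the function name `ncl_errors` and the choice of simple averaging are illustrative assumptions, not taken from the cited papers:

```python
import numpy as np

def ncl_errors(preds, target, lam=0.5):
    """Per-member NCL error for one training example (illustrative sketch).

    preds  : array of shape (M,) -- outputs f_i of the M ensemble members
    target : scalar desired output y
    lam    : penalty strength lambda (0 recovers independent training)
    """
    f_bar = preds.mean()  # simple-average ensemble output
    # penalty p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar)
    #            = -(f_i - f_bar)^2, since the deviations sum to zero
    penalty = -(preds - f_bar) ** 2
    return 0.5 * (preds - target) ** 2 + lam * penalty

preds = np.array([0.9, 1.1, 1.4])
errs = ncl_errors(preds, target=1.0)
```

With lam=0 the penalty vanishes and each member is trained on its own squared error; increasing lam rewards members whose outputs deviate from the ensemble mean, which is how the interaction among members enters the training of each one.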
“…Classification approaches based on ensemble learning are a way to combine weak learners to produce a strong learner (Chandra and Yao, 2006). In such cases, handling small inaccuracies in the training data is not crucial for ensemble classifiers (when comprised of advanced non-parametric classifiers such as neural networks and Support Vector Machines) (Colditz et al., 2008).…”
Section: Introduction
confidence: 99%
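The weak-to-strong combination mentioned in the statement above can be illustrated with the simplest possible case: unweighted majority voting over one-dimensional threshold "stumps". This is a generic sketch of the idea, not the method of the cited paper; the stump learners, thresholds, and function names are invented for illustration:

```python
import numpy as np

def stump_predict(x, threshold):
    """A weak learner: predicts class 1 if the feature exceeds a threshold."""
    return (x > threshold).astype(int)

def majority_vote(x, thresholds):
    """Combine the weak stumps by unweighted majority vote."""
    votes = np.stack([stump_predict(x, t) for t in thresholds])
    return (votes.mean(axis=0) > 0.5).astype(int)

x = np.array([0.1, 0.4, 0.6, 0.9])
pred = majority_vote(x, thresholds=[0.3, 0.5, 0.7])
```

Each stump alone misclassifies some region of the input, but the vote of the three agrees with the middle stump here; with diverse, better-than-chance members, the combined decision is typically more accurate than any single weak learner.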
“…In a study by Brodley and Friedl (2011), a consensus voting scheme is employed to filter the results of an ensemble of classifiers and eliminate mislabeled samples. Other methodologies for removing mislabels using predicted membership in other fields, such as computer vision and machine learning, can be seen in (Chandola et al., 2009; Escalante, 2005; Hodge and Austin, 2004).…”
confidence: 99%
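The consensus voting filter described in the statement above can be sketched as follows: a sample is flagged as a candidate mislabel only when every classifier in the ensemble disagrees with its given label (lowering the threshold to 0.5 yields a majority-vote filter instead). The array layout and the function name `consensus_filter` are illustrative assumptions, not taken from the cited study:

```python
import numpy as np

def consensus_filter(predictions, labels, threshold=1.0):
    """Flag samples whose given label disagrees with the ensemble's vote.

    predictions : (M, N) array -- labels predicted by M classifiers for N samples
    labels      : (N,) array   -- the (possibly noisy) training labels
    threshold   : fraction of classifiers that must disagree to flag a sample
                  (1.0 = consensus filter; 0.5 = majority-vote filter)
    """
    disagree = (predictions != labels).mean(axis=0)
    return disagree >= threshold  # True -> candidate mislabel, drop it

preds = np.array([[0, 1, 1],
                  [0, 1, 1],
                  [0, 1, 1]])
labels = np.array([0, 0, 1])
flags = consensus_filter(preds, labels)
```

Here all three classifiers predict 1 for the second sample while its given label is 0, so the consensus filter flags only that sample for removal.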