2015
DOI: 10.1002/widm.1158
Multi‐objective ensemble generation

Abstract: Ensemble methods that combine a committee of machine-learning models, each known as a member or base learner, have gained research interest in the past decade. One line of interest in ensemble generation involves the multi-objective approach, which attempts to generate members that are both accurate and diverse, fulfilling the theoretical requirements of good ensembles. These methods resolve the common difficulty of balancing the trade-off between accuracy and diversity and have been shown to be advantageous over single-object…


Cited by 32 publications (18 citation statements)
References 91 publications
“…These techniques are categorized into explicit and implicit methods depending on whether optimization of a diversity metric is applied in ensemble construction [53]. Most of these techniques belong to the implicit type, such as training data manipulation including bagging [34], [54], feature selection and feature extraction [55], or use of different initial parameters, different training algorithms and different architectures of learners [56], while boosting [57] and negative correlation learning [58] belong to the explicit type. Heterogeneous ensembles have natural advantages in structural diversity by using different architectures or different learning types of models [42], [55], [59].…”
Section: Heterogeneous Ensemble Generation
confidence: 99%
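The implicit/explicit split quoted above can be illustrated with a minimal sketch of bagging, the canonical implicit technique: diversity comes from training each member on a different bootstrap resample, not from optimizing any diversity metric. The decision-stump learner and 1-D data here are illustrative assumptions, not drawn from the surveyed methods.

```python
import random

def train_stump(X, y):
    """Fit a 1-D decision stump: pick the threshold and polarity that
    minimise training error on (X, y), with y in {0, 1}."""
    best = None
    for t in sorted(set(X)):
        for polarity in (0, 1):
            preds = [(x >= t) == bool(polarity) for x in X]
            err = sum(int(p) != yi for p, yi in zip(preds, y))
            if best is None or err < best[0]:
                best = (err, t, polarity)
    _, t, polarity = best
    return lambda x: int((x >= t) == bool(polarity))

def bagging_ensemble(X, y, n_members=11, seed=0):
    """Implicit diversity: each member sees a different bootstrap
    resample of the training data (Breiman-style bagging)."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        members.append(train_stump([X[i] for i in idx],
                                   [y[i] for i in idx]))
    return members

def predict(members, x):
    """Majority vote over the committee."""
    votes = sum(m(x) for m in members)
    return int(votes * 2 > len(members))
```

An explicit method such as negative correlation learning would instead add a diversity penalty to each member's training objective; here diversity arises only as a side effect of the resampling.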
“…This can be done by manipulating features, training instances, and/or model hyperparameters. According to Gu et al (2015), the formulation of a MOP for finding the trade-off between prediction scores and complexity is presented as the most promising such task.…”
Section: Multi-objective Ensemble Generation
confidence: 99%
“…The application of MOO algorithms for the optimisation of ensemble learning models has been performed by many authors, and is known as multi-objective ensemble generation (MOEG) (Gu et al, 2015). Most of the works deal with ensemble member generation, where MOO is applied in order to generate a set of diverse predictive models (Tan et al, 2014; Smith and Jin, 2014; Rosales-Pérez et al, 2014; Gu and Jin, 2014; Rosales-Pérez et al, 2017; Peimankar et al, 2017; Ribeiro and Reynoso-Meza, 2018).…”
Section: Introduction
confidence: 99%
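The core of the MOEG idea quoted above — scoring each candidate member on accuracy and diversity simultaneously and keeping the Pareto-optimal trade-offs — can be sketched as follows. The two objectives chosen here (error rate, and mean pairwise agreement as an inverse diversity proxy) are one plausible instantiation, not the specific formulation of any cited work.

```python
def member_objectives(preds, y):
    """For each member, return (error rate, mean pairwise agreement).
    Both objectives are minimised: low agreement means high diversity."""
    n, m = len(preds), len(y)
    errors = [sum(p != t for p, t in zip(pr, y)) / m for pr in preds]
    agree = [[sum(a == b for a, b in zip(preds[i], preds[j])) / m
              for j in range(n)] for i in range(n)]
    sims = [sum(agree[i][j] for j in range(n) if j != i) / (n - 1)
            for i in range(n)]
    return list(zip(errors, sims))

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, strictly
    better in at least one (minimisation)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_members(preds, y):
    """Indices of the non-dominated (Pareto-optimal) members."""
    objs = member_objectives(preds, y)
    return [i for i, p in enumerate(objs)
            if not any(dominates(q, p) for q in objs)]
```

In a full MOEG method this non-dominated filter would sit inside an evolutionary loop (generate candidates, evaluate both objectives, keep the front); here only the selection step is shown.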
“…There could be scope for research in determining time-related behavior in textile data [72]. Contributions are also expected on the topic of classifying textile data using ensemble learning [73] techniques in the coming years. In the textile industry, a single classifier has been used so far; however, combining multiple models can provide a more accurate prediction than depending on a single model.…”
Section: Future Directions
confidence: 99%