2019
DOI: 10.1016/j.asoc.2019.105580

Maximizing diversity by transformed ensemble learning

Cited by 35 publications (14 citation statements). References 34 publications.
“…[16] The former focuses on a pair of classifiers [17], [18], [19], while the latter seeks the diversity exhibited by a whole ensemble [20], [21]. Various attempts have been made to obtain the best possible diversity [22], such as tuning learning algorithms [23], [24], sampling methods [25], [26], and feature selection [27], [28], [29], [30].…”
Section: Related Work, A. Static Ensemble Selection
confidence: 99%
“…They integrated unsupervised clustering with a fuzzy assignment process to make full use of data patterns and improve ensemble performance. Mao et al. [36] propose a transformed ensemble learning framework in which the combination of multiple base learners is converted into a linear transformation of those learners, and an optimization objective function is constructed to balance accuracy and diversity. The alternating direction method of multipliers (ADMM) is used to solve this problem.…”
Section: Introduction
confidence: 99%
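The statement quoted above describes the core mechanism of the paper: the ensemble output is expressed as a linear transformation (here, a weighted linear combination) of the base learners, and the combination weights are found by optimizing an objective that trades off accuracy against diversity. The following minimal Python sketch illustrates what such an objective can look like; the prediction matrix P, targets y, squared-error accuracy term, variance-style diversity term, and trade-off weight lam are all illustrative assumptions rather than the paper's exact formulation, and a general-purpose solver stands in for the ADMM procedure the paper reportedly uses.

import numpy as np
from scipy.optimize import minimize

# Hypothetical validation data: P holds the predictions of m base learners
# on n samples, y holds the true targets.
rng = np.random.default_rng(0)
n, m = 200, 5
y = rng.normal(size=n)
P = y[:, None] + rng.normal(scale=0.5, size=(n, m))  # noisy base-learner outputs

lam = 0.1  # assumed accuracy/diversity trade-off hyperparameter

def objective(w):
    pred = P @ w                                  # linear combination of base learners
    accuracy_term = np.mean((pred - y) ** 2)      # squared-error loss of the ensemble
    # Diversity measured as the spread of base-learner outputs around the
    # ensemble output; subtracting it rewards disagreement (sign and
    # definition are assumptions, not the paper's formulation).
    diversity_term = np.mean((P - pred[:, None]) ** 2)
    return accuracy_term - lam * diversity_term

# The paper reportedly solves its objective with ADMM; scipy's generic
# minimizer is used here purely as a stand-in for the sketch.
res = minimize(objective, x0=np.full(m, 1.0 / m))
print("combination weights:", res.x)

With lam well below 1 the objective remains bounded below, so the solver settles on weights that keep the ensemble accurate while tolerating some disagreement among its members.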
“…In previous studies [35, 36], although the objective function used to build the ensemble learning model includes both accuracy and diversity factors, the diversity factor serves only as a regularization term to avoid overfitting. Moreover, the diversity factor considers only the prediction results of the classifiers.…”
Section: Introduction
confidence: 99%
“…The literature in [30] and [31] describes the ideal ensemble as one constructed from learners with small error and good diversity. However, rich diversity may cause the predicted values of the meta-learners to deviate from the true values, and improving individual accuracy often reduces the diversity of the meta-learners; that is, accuracy and diversity usually conflict with each other.…”
Section: Introduction
confidence: 99%