2016 12th World Congress on Intelligent Control and Automation (WCICA)
DOI: 10.1109/wcica.2016.7578244
Dynamic ensemble selection methods for heterogeneous data mining

Abstract: Big data is often collected from multiple sources with possibly different features, representations, and granularity, and hence is defined as heterogeneous data. Such multiple datasets need to be fused together in some way for further analysis. Data fusion at the feature level requires domain knowledge and can be time-consuming and ineffective, but it can be avoided if decision-level fusion is applied properly. Ensemble methods appear to be an appropriate paradigm to do just that, as each subset of hetero…

Cited by 16 publications (4 citation statements). References 14 publications.
“…It stops adding models into the ensemble when the ensemble's performance starts to decrease after achieving the best performance. Ensemble selection allows ensembles to be optimized to performance metrics such as accuracy, cross-entropy, mean precision, or ROC Area (Ballard & Wang, 2016; Nguyen et al., 2020).…”
Section: Ensemble Selection
confidence: 99%
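The statement above describes greedy forward ensemble selection: models are added one at a time as long as they improve a held-out performance metric, and the process stops once performance no longer improves. The following is a minimal sketch of that idea under assumptions of my own (majority-vote combination, accuracy as the metric, and cached per-model predictions); it is an illustration, not the authors' implementation.

```python
def ensemble_accuracy(selected, predictions, y_true):
    """Accuracy of a majority vote over the selected models' cached predictions."""
    correct = 0
    for i in range(len(y_true)):
        votes = [predictions[m][i] for m in selected]
        # Majority vote; ties are broken toward the smallest label.
        majority = max(set(votes), key=lambda v: (votes.count(v), -v))
        correct += majority == y_true[i]
    return correct / len(y_true)

def greedy_selection(predictions, y_true):
    """predictions: dict mapping model name -> list of predicted labels
    on a validation set. Returns (selected model names, best accuracy)."""
    selected, best = [], 0.0
    remaining = set(predictions)
    while remaining:
        gains = {m: ensemble_accuracy(selected + [m], predictions, y_true)
                 for m in remaining}
        best_model = max(gains, key=gains.get)
        # Stop once adding any model no longer improves the ensemble.
        if selected and gains[best_model] <= best:
            break
        best = gains[best_model]
        selected.append(best_model)
        remaining.remove(best_model)
    return selected, best
```

The same greedy loop works unchanged for other metrics (cross-entropy, mean precision, ROC area) by swapping the scoring function and the comparison direction.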
“…Ballard and Wang [16] developed dynamic ensemble selection methods for heterogeneous data mining. Although their datasets are not multimedia, their basic idea of combining multiple datasets at the decision level inspired this work.…”
Section: Related Work
confidence: 99%
“…Since local accuracy is a key feature of the DES method, many algorithms use k-nearest neighbors as a framework [19, 26, 27]. Other methods for generating different homogeneous classifiers have been proposed, including random subspace [28], bagging [29], boosting [30], and clustering [31].…”
Section: Introduction
confidence: 99%
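The statement above notes that dynamic ensemble selection (DES) commonly estimates local accuracy via k-nearest neighbors: for each query point, the classifier that performs best on the query's nearest validation neighbors is chosen. A minimal sketch of that scheme (overall-local-accuracy style selection, with classifiers modeled as plain callables and Euclidean distance; all names are illustrative, not from the paper):

```python
import math

def ola_select(query, X_val, y_val, classifiers, k=3):
    """Pick the classifier with the highest accuracy on the k validation
    points nearest to the query (local-accuracy-based dynamic selection)."""
    # Indices of the k nearest validation points by Euclidean distance.
    nearest = sorted(range(len(X_val)),
                     key=lambda i: math.dist(query, X_val[i]))[:k]
    best_clf, best_acc = None, -1.0
    for clf in classifiers:
        # Local accuracy: fraction of the k neighbors this classifier gets right.
        acc = sum(clf(X_val[i]) == y_val[i] for i in nearest) / k
        if acc > best_acc:
            best_clf, best_acc = clf, acc
    return best_clf
```

The query is then labeled by the selected classifier alone; other DES variants instead select a locally competent subset and combine its votes.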