2015 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2015.55

Having a Blast: Meta-Learning and Heterogeneous Ensembles for Data Streams

Abstract: Ensembles of classifiers are among the best performing classifiers available in many data mining applications. However, most ensembles developed specifically for the dynamic data stream setting rely on only one type of base-level classifier, most often Hoeffding Trees. In this paper, we study the use of heterogeneous ensembles, comprised of fundamentally different model types. Heterogeneous ensembles have proven successful in the classical batch data setting; however, they do not easily transfer to the data stream…

Cited by 41 publications (33 citation statements)
References 28 publications
“…For the data stream classification, a Heterogeneous Ensemble with Feature drift for Data Streams (HEFT-Stream) Nguyen et al (2012) builds a heterogeneous ensemble composed of different online classifiers (e.g., Online Naive Bayes). Adaptive modifications of the heterogeneous ensembles were also successfully applied on the drifting data streams ( Van Rijn et al, 2016 ; Frías-Blanco et al, 2016 ; Van Rijn et al, 2018 ; Idrees et al, 2020 ), and many of them proved suitable to address issues such as class imbalance ( Large, Lines & Bagnall, 2017 ; Fernández et al, 2018 ; Ren et al, 2018 ; Wang, Minku & Yao, 2018 ; Ghaderi Zefrehi & Altınçay, 2020 ). The approach described in this paper aims to combine the construction of the adaptive heterogeneous ensemble with a diversity-based update of the ensemble members.…”
Section: Related Work (mentioning)
confidence: 99%
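The construction described in the quoted passage (a heterogeneous ensemble built from fundamentally different online learners, trained incrementally on the stream and combined by voting) can be illustrated with a minimal Python sketch. The choice of base learners, the scikit-learn `partial_fit` interface, and the plain majority vote are illustrative assumptions, not the HEFT-Stream or BLAST algorithm itself.

```python
# Minimal sketch of a heterogeneous online ensemble: fundamentally different
# incremental learners trained on the same stream and combined by majority
# vote. Illustrative only; not the HEFT-Stream or BLAST algorithm.
import numpy as np
from collections import Counter
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import SGDClassifier, Perceptron

class HeterogeneousOnlineEnsemble:
    def __init__(self, classes):
        self.classes = np.asarray(classes)
        # Different model families, all supporting incremental updates.
        self.members = [
            GaussianNB(),        # generative
            SGDClassifier(),     # linear, gradient-based (hinge loss)
            Perceptron(),        # linear, mistake-driven
        ]

    def partial_fit(self, X, y):
        for m in self.members:
            m.partial_fit(X, y, classes=self.classes)
        return self

    def predict(self, X):
        # Unweighted majority vote over the members' predictions.
        votes = np.array([m.predict(X) for m in self.members])
        return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])
```

In a stream setting such an ensemble would typically be evaluated prequentially (test-then-train) and its members adapted or replaced on drift, as the adaptive variants cited above do.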
“…BLAST (short for best last) [42] introduced an Online Performance Estimation framework to weight the votes of (heterogeneous) ensemble members. Based on zero/one loss function, i.e.…”
Section: Heterogeneous Ensembles (mentioning)
confidence: 99%
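The mechanism this quotation attributes to BLAST (estimating each member's recent performance online with a zero/one loss and letting the best recent member, the "best last", classify the next instance) can be sketched as follows. The sliding-window bookkeeping, the window size, and the per-instance predict/partial_fit member interface are assumptions for illustration, not necessarily the exact Online Performance Estimation scheme of the paper.

```python
# Sketch of BLAST-style "best last" selection via online performance
# estimation: keep a sliding window of zero/one outcomes per ensemble member
# and let the member with the best recent estimate classify the next instance.
# Window size and member interface are illustrative assumptions.
from collections import deque

class BestLastSelector:
    def __init__(self, members, classes, window=1000):
        self.members = members   # incremental learners with predict / partial_fit
        self.classes = classes
        self.hits = [deque(maxlen=window) for _ in members]

    def _estimate(self, i):
        # Windowed accuracy estimate from recent zero/one outcomes (1 = correct).
        return sum(self.hits[i]) / len(self.hits[i]) if self.hits[i] else 0.0

    def predict(self, X_row):
        # "Best last": defer to the member with the best recent performance.
        best = max(range(len(self.members)), key=self._estimate)
        return self.members[best].predict(X_row)[0]

    def update(self, X_row, y_true):
        # Prequential step: score every member on the new instance, then train it.
        # Assumes members were primed with one initial partial_fit call.
        for i, m in enumerate(self.members):
            self.hits[i].append(1 if m.predict(X_row)[0] == y_true else 0)
            m.partial_fit(X_row, [y_true], classes=self.classes)
```

Weighting every member's vote by its estimated performance, rather than selecting only the single best member, is the vote-weighting variant the quotation refers to.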
“…Online Gradient Descent) and NOGD (Nystrom Online Gradient Descent) [2]; ensemble-based BLAST [3]; state-of-the-art second-order linear AROW (Adaptive Regularization of Weights) [4]; widely used first-order linear PA (Passive Aggressive learning) [5], the most used decision tree HT (Hoeffding Tree) [6]. We also compare our proposed methods with ONBG (Online Naïve Bayes for Gaussians) [7] a first-order generative method.…”
Section: Accepted Manuscript (mentioning)
confidence: 99%
“…Bayesian framework (BE) [27] was proposed for recursively estimating the classifier weights in online learning of a classifier ensemble. Another recent interesting work, named as BLAST [3], studies the use of heterogeneous ensembles comprised of fundamentally different model types.…”
Section: Online Classifiers (mentioning)
confidence: 99%