2019
DOI: 10.1109/access.2019.2914725
Resample-Based Ensemble Framework for Drifting Imbalanced Data Streams

Abstract: Machine learning in real-world scenarios is often challenged by concept drift and class imbalance. This paper proposes a Resample-based Ensemble Framework for Drifting Imbalanced Stream (RE-DI). The ensemble framework consists of a long-term static classifier to handle gradual concept drift and multiple dynamic classifiers to handle sudden concept drift. The weights of the ensemble classifier are adjusted from two aspects. First, a time-decayed strategy decreases the weights of the dynamic classifiers to make the ensemble c…
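The abstract describes two weight-adjustment mechanisms: time decay, which gradually discounts the dynamic classifiers, and a reinforcement step that boosts classifiers performing well on the minority class. A minimal sketch of that idea, assuming a multiplicative decay and an additive recall-based boost (the function name, constants, and update rule here are illustrative assumptions, not the authors' implementation):

```python
# Hypothetical sketch of the RE-DI weight-update idea: apply time decay to
# each classifier's weight, reinforce by its recall on the minority class,
# then renormalize. Constants `decay` and `boost` are assumed values.

def update_weights(weights, minority_recalls, decay=0.9, boost=0.1):
    """Time-decay each weight, boost by minority-class recall, renormalize."""
    decayed = [w * decay for w in weights]
    reinforced = [w + boost * r for w, r in zip(decayed, minority_recalls)]
    total = sum(reinforced)
    return [w / total for w in reinforced]

# A classifier with high minority-class recall keeps a larger share of the
# ensemble weight after the update.
w = update_weights([0.5, 0.3, 0.2], [0.9, 0.4, 0.1])
```

Because every weight is decayed by the same factor before the boost, only the reinforcement term changes the relative ordering, which matches the stated intent: decay handles staleness, reinforcement handles class imbalance.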

Cited by 29 publications (7 citation statements)
References 28 publications
“…In [45], stratified bootstrapping is employed: each data chunk is used to learn a stratified bagging classifier, which itself consists of several base estimators learned from a stratum of that data chunk. Another approach, namely RE-DI, is proposed in [46], which uses a long-term classifier to handle gradual concept drifts and multiple dynamic classifiers to handle sudden concept drifts. The classifier weights are influenced by two sources: a time-decaying mechanism used to focus on newer concepts and a reinforcement mechanism to increase the weight of classifiers that perform better on minority classes.…”
Section: Literature Review
confidence: 99%
“…Weights assigned to each base classifier can be continuously updated to reflect their current competencies on minority classes (Ren et al, 2018). A reinforcement learning mechanism can be used to increase the weights of the base classifiers that perform better on the minority class (Zhang et al, 2019). One can use a hybrid approach that combines resampling minority instances with dynamically weighting base classifiers based on their predictive performance on sliding windows of minority samples (Yan et al, 2022).…”
Section: Ensembles For Imbalanced Data Streams
confidence: 99%
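The hybrid approach quoted above weights each base classifier by its recent predictive performance on a sliding window of minority samples. A minimal sketch of that bookkeeping, assuming the weight is simply the classifier's accuracy over the window (the class name and window size are illustrative assumptions):

```python
from collections import deque

# Hypothetical tracker for one base classifier: keep a fixed-size sliding
# window of its outcomes on minority-class samples, and expose its accuracy
# over that window as the classifier's ensemble weight.

class MinorityWindowWeight:
    def __init__(self, size=100):
        self.window = deque(maxlen=size)  # 1 = correct, 0 = wrong

    def record(self, correct):
        """Record the outcome of one minority-class prediction."""
        self.window.append(1 if correct else 0)

    def weight(self):
        """Accuracy on recent minority samples; 0.0 before any data."""
        return sum(self.window) / len(self.window) if self.window else 0.0

mw = MinorityWindowWeight(size=3)
for correct in (True, False, True):
    mw.record(correct)
```

The bounded `deque` gives the forgetting behavior for free: once the window is full, each new minority outcome evicts the oldest one, so the weight tracks only the current concept.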
“…The model may also use a dynamic approach, where learners are added and removed on the fly and the most recent data drives the prediction [15]. Additive Expert Ensembles (AddExp) [34], Dynamic Weighted Majority (DWM) [33], Geometrically Optimum and Online-Weighted Ensemble (GOOWE) [7], Learn++.NSE [18], Resample-based Ensemble Framework for Drifting Imbalanced Stream (RE-DI) [53] and Dynamic Adaptation to Concept Changes (DACC) [12] fall into this category.…”
Section: Related Work
confidence: 99%