2012
DOI: 10.1109/tkde.2011.58

DDD: A New Ensemble Approach for Dealing with Concept Drift

Abstract: Online learning algorithms often have to operate in the presence of concept drifts. A recent study revealed that different diversity levels in an ensemble of learning machines are required in order to maintain high generalisation on both old and new concepts. Inspired by this study and based on a further study of diversity with different strategies to deal with drifts, we propose a new online ensemble learning approach called Diversity for Dealing with Drifts (DDD). DDD maintains ensembles with differ…
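The approach outlined in the abstract rests on keeping ensembles with different diversity levels and shifting which one drives predictions once a drift is detected. The following Python sketch is only a loose illustration of that idea, not the published DDD algorithm: the Poisson-based diversity control of online bagging, the caller-supplied drift test, and all member counts and lambda values are assumptions made for the example.

import numpy as np
from sklearn.linear_model import SGDClassifier


class OnlineBaggingEnsemble:
    """Online bagging: each example is presented to each member Poisson(lam)
    times.  A smaller lam means members see fewer, more different subsets of
    the stream, which increases ensemble diversity."""

    def __init__(self, n_members=10, lam=1.0, classes=(0, 1), seed=0):
        self.rng = np.random.default_rng(seed)
        self.lam = lam
        self.classes = np.array(classes)
        self.members = [SGDClassifier() for _ in range(n_members)]

    def partial_fit(self, x, y):
        x = np.atleast_2d(x)
        for m in self.members:
            for _ in range(self.rng.poisson(self.lam)):
                m.partial_fit(x, [y], classes=self.classes)

    def predict(self, x):
        x = np.atleast_2d(x)
        fitted = [m for m in self.members if hasattr(m, "coef_")]
        if not fitted:
            return self.classes[0]                  # no member trained yet
        votes = [m.predict(x)[0] for m in fitted]
        return max(set(votes), key=votes.count)


def run_stream(stream, drift_detected):
    """stream yields (x, y) pairs; drift_detected(t, errors) is a user-supplied test."""
    low = OnlineBaggingEnsemble(lam=1.0, seed=1)    # low diversity (standard online bagging)
    high = OnlineBaggingEnsemble(lam=0.1, seed=2)   # high diversity (members see fewer examples)
    active = low                                    # ensemble currently used for predictions
    errors = []
    for t, (x, y) in enumerate(stream):
        errors.append(int(active.predict(x) != y))  # prequential: test first ...
        if drift_detected(t, errors):
            # After a drift, an ensemble that fitted the old concept only loosely
            # tends to adapt faster, so predictions switch to the high-diversity
            # ensemble and a fresh high-diversity ensemble is started.
            active, high = high, OnlineBaggingEnsemble(lam=0.1, seed=t + 3)
        for ens in {low, high, active}:             # ... then train every ensemble
            ens.partial_fit(x, y)
    return errors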

Cited by 381 publications (283 citation statements)
References 32 publications
“…This is not unexpected, because a learner cannot learn what has not been taught. Such a situation might be better handled by techniques for detecting concept drifts (Minku and Yao 2012b). Similarly, on time steps 30-37, both CC-RT0 and WC-RT win frequently, causing their weights to increase competitively during this period.…”
Section: Discussion
confidence: 99%
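The weight dynamics described in this statement (frequent winners gaining weight at the expense of the others) follow the usual multiplicative weighted-majority pattern. A generic sketch of that pattern is shown below; the penalty factor beta and the normalisation step are assumptions made for illustration, not the exact scheme of the cited study.

def update_weights(weights, predictions, true_label, beta=0.5):
    """Multiplicative weighted-majority style update: members that predicted
    incorrectly have their weight multiplied by beta (0 < beta < 1), so the
    relative weight of frequently winning members grows over time."""
    for name, pred in predictions.items():
        if pred != true_label:
            weights[name] *= beta
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}


# Example: if CC-RT0 and WC-RT keep winning, their normalised weights
# grow towards 0.5 each while the weights of the other members shrink.
w = {"CC-RT0": 0.25, "WC-RT": 0.25, "other1": 0.25, "other2": 0.25}
for _ in range(5):
    w = update_weights(w, {"CC-RT0": 1, "WC-RT": 1, "other1": 0, "other2": 0}, true_label=1)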
“…5 was performed, as both DCL and the RTs used in this study are deterministic and the datasets must have their order of examples fixed to represent the real online learning scenario of a company. This is a standard procedure when evaluating online learning approaches (Minku and Yao 2012b; Kolter and Maloof 2007).…”
Section: Methods
confidence: 99%
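The fixed-order evaluation referred to here is typically carried out as prequential (test-then-train) evaluation: each example is first used to test the current model and only then to update it, and the example order is never shuffled. A minimal sketch under that reading follows; the predict/update learner interface is an assumption, not an API from the cited work.

def prequential_evaluation(stream, model):
    """Test-then-train evaluation over a stream whose order is kept fixed,
    as is standard when simulating a real online learning scenario."""
    n, mistakes = 0, 0
    for x, y in stream:                 # the order of the stream is never shuffled
        if n > 0:                       # nothing to test before the first update
            mistakes += int(model.predict(x) != y)
        model.update(x, y)              # incremental (online) update
        n += 1
    return mistakes / max(n - 1, 1)     # online error rate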
“…Using small chunk sizes can partly help in detecting sudden changes, but it may also harm the stability and the computational cost of the ensemble. Simple incremental learning [9][10][11][12][13][14] stays sensitive to sudden changes and therefore adapts in a timely manner, but it is not sufficient for coping with gradual drifts, because it forgets historical data and adapts sharply to the newest state. In order to adapt to both sudden and gradual changes, it could be suitable to combine significant features from block-based ensembles and incremental learning approaches.…”
Section: Introduction
confidence: 99%
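One way to read the combination proposed in this statement is an ensemble whose members are updated incrementally on every example (sensitivity to sudden drifts) while membership is refreshed at chunk boundaries (adaptation to gradual drifts). The sketch below illustrates that reading; the chunk size, member count, and learner interface (predict/update, with a default prediction before the first update) are assumptions rather than a particular published algorithm.

from collections import Counter


class HybridChunkIncrementalEnsemble:
    """Illustrative hybrid: every member is updated incrementally on each
    example (fast reaction to sudden drifts), and at the end of every chunk
    the worst-performing member is replaced by one trained only on that
    chunk (block-based adaptation to gradual drifts)."""

    def __init__(self, make_learner, n_members=5, chunk_size=500):
        self.make_learner = make_learner        # factory returning a fresh incremental learner
        self.members = [make_learner() for _ in range(n_members)]
        self.errors = [0] * n_members           # per-member errors on the current chunk
        self.chunk = []
        self.chunk_size = chunk_size

    def predict(self, x):
        votes = Counter(m.predict(x) for m in self.members)
        return votes.most_common(1)[0][0]

    def update(self, x, y):
        # Incremental part: test-then-train every member on the new example.
        for i, m in enumerate(self.members):
            self.errors[i] += int(m.predict(x) != y)
            m.update(x, y)
        self.chunk.append((x, y))
        # Block-based part: at a chunk boundary, replace the weakest member
        # with a fresh learner trained only on the most recent chunk.
        if len(self.chunk) == self.chunk_size:
            worst = max(range(len(self.members)), key=self.errors.__getitem__)
            fresh = self.make_learner()
            for cx, cy in self.chunk:
                fresh.update(cx, cy)
            self.members[worst] = fresh
            self.errors = [0] * len(self.members)
            self.chunk = []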