2007
DOI: 10.3233/ida-2007-11102
Boosting classifiers for drifting concepts

Abstract: This paper proposes a boosting-like method to train a classifier ensemble from data streams. It naturally adapts to concept drift and allows the drift to be quantified in terms of its base learners. The algorithm is empirically shown to outperform learning algorithms that ignore concept drift, and it performs no worse than advanced adaptive time-window and example-selection strategies that store all the data and are thus not suited for mining massive streams. 1 A preliminary short version of this paper was presented at…

Cited by 93 publications (67 citation statements)
References 22 publications
“…In these cases, both the structure of an ensemble and the weights of existing classifiers are altered. Wang et al [2] and Scholz and Klinkenberg [15] recognize the high update costs associated with continuous global updates of a model in high-density data streams. They devise a framework for rapidly revising only relevant components of an ensemble and initiate the training of new classifiers only when required.…”
Section: Past Research
confidence: 99%
“…Ensemble-based techniques have been shown to be an effective and scalable approach to addressing concept-drifting challenges in streaming data [11,13-15]. Ensemble-based systems consist of multiple classifiers that can be viewed as a committee of experts, whose individual votes are combined in order to formulate a final classification.…”
Section: Past Research
confidence: 99%
“…However, recent research has concentrated on approaches that actively grow/replace ensemble membership. Moreover, it may even be possible to maintain multiple ensembles in order to: (1) react to cyclic behaviours in the stream [158]; or (2) explicitly maintain multiple ensembles with different diversity properties [138]. A summary of potential design decisions associated with ensembles as applied to streaming data might therefore include:…”
Section: Ensemble ML Perspective
confidence: 99%
“…• Incremental change to current knowledge (e.g., [107,158]) as opposed to the outright dropping of previous knowledge (e.g., [68,167]);
• Adapting the weight associated with learners versus no weighting: weight adaptation represents an intermediate level of refinement in which the models denoting the ensemble remain unchanged but their relative contribution to the voting is modified [2,90,152]. Conversely, weightless frameworks emphasize plasticity and tend to drop weaker ensemble members immediately (e.g., [81,167]);
• Classifier weight adaptation versus data-instance-based weight adaptation: the weighting of votes from an ensemble is generally a function of either classifier performance [54,65,152] or of the data from which a member of the ensemble was constructed (e.g., [20,150]);
• Identification of the ensemble member for replacement: various heuristics have been proposed for targeting the ensemble member to replace when performance as a whole is deemed poor, e.g., replace the oldest [167] or the member with the least 'contribution' [107,171].…”
Section: Ensemble ML Perspective
confidence: 99%
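The classifier-weight-adaptation scheme described in the excerpt above (member models stay fixed while their relative voting influence is adjusted as the stream evolves) can be illustrated with a minimal weighted-majority sketch. This is hypothetical illustrative code, not the paper's actual algorithm; all class and parameter names here are invented.

```python
# Hypothetical sketch of classifier-weight adaptation for a streaming
# ensemble: member models stay fixed, but members that vote incorrectly
# have their voting weight discounted, so under concept drift stale
# members lose influence without any retraining.

class WeightedVotingEnsemble:
    def __init__(self, members, beta=0.5):
        # members: list of callables mapping an instance x to a label
        self.members = members
        self.weights = [1.0] * len(members)
        self.beta = beta  # penalty factor for a wrong vote (0 < beta < 1)

    def predict(self, x):
        # Weighted vote: sum the weights behind each proposed label.
        votes = {}
        for w, m in zip(self.weights, self.members):
            label = m(x)
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)

    def update(self, x, y):
        # Once the true label y arrives, down-weight members that erred.
        for i, m in enumerate(self.members):
            if m(x) != y:
                self.weights[i] *= self.beta


# Usage: three trivial "experts" on an integer stream whose true
# concept is the parity of x; only the parity expert keeps full weight.
always_one = lambda x: 1
always_zero = lambda x: 0
parity = lambda x: x % 2

ens = WeightedVotingEnsemble([always_one, always_zero, parity])
for x in range(10):
    ens.update(x, x % 2)
print(ens.predict(3))  # parity expert now dominates the vote -> 1
```

The discount factor `beta` plays the role the survey attributes to weight adaptation: it trades off stability (high `beta`, slow forgetting) against plasticity (low `beta`, rapid demotion of drifted members).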