Learning drifting concepts: Example selection vs. example weighting (2004)
DOI: 10.3233/ida-2004-8305

Cited by 357 publications (182 citation statements); references 14 publications.
“…Moving window [53]; partial memory learning [54]; instance weighting and selection [55]; gradual forgetting [56]…”
Section: Adaptation Mechanisms
confidence: 99%
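The gradual-forgetting mechanism listed above can be illustrated with a minimal sketch: each training example receives a weight that decays exponentially with its age, so a learner that accepts per-example weights gradually discounts old data. The function name and the decay rate `lam` are illustrative assumptions, not values from the cited papers.

```python
import math

def forgetting_weights(n_examples: int, lam: float = 0.5) -> list[float]:
    """Return weights w_i = exp(-lam * age_i), where age 0 is the newest example.

    `lam` (the forgetting rate) is an assumed illustrative parameter:
    larger values forget old examples faster.
    """
    return [math.exp(-lam * (n_examples - 1 - i)) for i in range(n_examples)]

weights = forgetting_weights(5)
# The newest example keeps weight 1.0; older examples shrink toward 0.
```

Such weights could be passed to any learner that supports per-example weighting (e.g. a `sample_weight` argument), turning a standard batch learner into one that adapts to drift without discarding examples outright.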
“…In the context of learning data streams, some proposed algorithms are capable of dealing with gradual concept drift [23], some can handle abrupt concept drift [4,6,12,13,31], and some have the potential to cope with both types [36]. However, most of these methods are appropriate only for supervised environments in which the labels of data are fully known.…”
Section: Related Work
confidence: 99%
“…There exist several techniques: instance selection, instance weighting, and ensemble learning. Instance selection (Klinkenberg 2004) is the best-known technique and includes two methodologies, fixed window and adaptive window, in which the model is regenerated using the most recent data batches. The instance-selection technique suffers from the problem of choosing the window size in the fixed case, and of tracking the pace of drift when adaptive windowing is adopted.…”
Section: Drift Handling
confidence: 99%
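The fixed-window variant of instance selection described above can be sketched in a few lines: only the most recent `window_size` examples are retained, and the model is retrained on that buffer. The class name and the window size of 3 are illustrative assumptions, not details from the cited work.

```python
from collections import deque

class FixedWindow:
    """Minimal sketch of fixed-window instance selection.

    Only the last `window_size` examples are kept; a model would be
    regenerated from `training_set()` after each new batch arrives.
    """

    def __init__(self, window_size: int):
        # deque with maxlen silently evicts the oldest element when full
        self.buffer = deque(maxlen=window_size)

    def add(self, example):
        self.buffer.append(example)  # oldest example drops out automatically

    def training_set(self) -> list:
        return list(self.buffer)

win = FixedWindow(window_size=3)
for x in [1, 2, 3, 4, 5]:
    win.add(x)
# training_set() now holds only the three most recent examples: [3, 4, 5]
```

An adaptive-window variant would additionally shrink or grow `window_size` based on an observed error signal, which is exactly where the "pace of drift" problem noted in the quotation arises.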