2014
DOI: 10.1109/tnnls.2012.2236570

Active Learning With Drifting Streaming Data

Abstract: In learning to classify streaming data, obtaining true labels may require major effort and may incur excessive cost. Active learning focuses on carefully selecting as few labeled instances as possible for learning an accurate predictive model. Streaming data poses additional challenges for active learning, since the data distribution may change over time (concept drift) and models need to adapt. Conventional active learning strategies concentrate on querying the most uncertain instances, which are typically co…

Cited by 316 publications (275 citation statements)
References 15 publications
“…This is far from real-life scenarios and would impose tremendous labeling costs on the system. Therefore, methods for streaming data that reduce the cost of supervision (e.g., by active learning) are currently of crucial importance in this field [69]. This raises an open issue: how should imbalanced data streams be sampled?…”
Section: Learning From Imbalanced Data Streams
confidence: 99%
“…Žliobaitė et al. [15] present a generic framework for incremental active learning from drifting data streams. Several active learning strategies are incorporated into the framework.…”
Section: Related Work
confidence: 99%
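The framework referred to in this statement wraps a query strategy, a labeling budget, and an incrementally updated classifier around a single pass over the stream. The loop below is a minimal illustrative sketch of that idea only; the `stream`, `strategy`, `classifier`, and budget-accounting interfaces are assumptions made for the example, not the paper's actual API.

```python
def active_learn_stream(stream, classifier, strategy, budget):
    """Single pass over a stream with budget-constrained label queries.

    `stream` yields (x, get_label) pairs; get_label() is called only when
    the strategy decides to query. `budget` is the target fraction of
    instances that may be labeled. Illustrative sketch only.
    """
    seen, labeled = 0, 0
    for x, get_label in stream:
        seen += 1
        spent = labeled / seen  # fraction of the stream labeled so far
        if spent < budget and strategy.should_query(classifier, x):
            y = get_label()                    # pay the labeling cost
            classifier.partial_fit([x], [y])   # adapt the model incrementally
            labeled += 1
    return classifier
```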
“…OnPEAL is compared with the paired ensemble framework for active learning (PEFAL) presented in [16] and three representative active learning strategies described in [15], including Variable Uncertainty Strategy (VarUn), Uncertainty Strategy with Randomization (RanVarUn), and Split Strategy (Split). All the experiments are performed using the MOA data stream software suite [19].…”
Section: Experimental Evaluation
confidence: 99%
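Among the strategies named above, Variable Uncertainty queries an instance when the classifier's top posterior falls below an adaptive threshold that tightens after each query and relaxes otherwise, while the randomized variant jitters that threshold so queries are not confined to the current decision boundary. The class below is a rough Python sketch under those assumptions; the parameter names (`theta`, `s`, `sigma`) and the `predict_proba` interface are illustrative, not taken verbatim from the paper or from MOA.

```python
import random

class VariableUncertainty:
    """Query when the top posterior falls below an adaptive threshold;
    tighten the threshold after a query, relax it otherwise."""

    def __init__(self, theta=1.0, s=0.01, randomize=False, sigma=1.0):
        self.theta, self.s = theta, s
        self.randomize, self.sigma = randomize, sigma  # RanVarUn variant

    def should_query(self, classifier, x):
        certainty = max(classifier.predict_proba([x])[0])  # top posterior
        threshold = self.theta
        if self.randomize:
            # Randomized variant: jitter the threshold so instances far
            # from the decision boundary can occasionally be queried too.
            threshold *= random.gauss(1.0, self.sigma)
        if certainty < threshold:
            self.theta *= (1 - self.s)   # queried: tighten the threshold
            return True
        self.theta *= (1 + self.s)       # skipped: relax the threshold
        return False
```

Written this way, the strategy exposes only a `should_query` method, so it can be plugged directly into a budget-managed loop like the one sketched earlier.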
“…Random strategy), queries are blindly chosen. The Random strategy (Zliobaite et al 2014) labels the incoming batches randomly instead of wisely deciding which batches are more informative. Constrained by budget, batches are sent for annotation.…”
Section: Unwise Strategy
confidence: 99%
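As described in this statement, the Random baseline needs no feedback from the model: each incoming batch is sent for annotation with probability equal to the labeling budget. A minimal sketch under that reading, with the `annotate` callback and the batch interface as assumptions:

```python
import random

def random_strategy(batches, budget, annotate, classifier):
    """Label each incoming batch with probability `budget` (a fraction in
    [0, 1]), ignoring informativeness; a baseline, not a wise selection rule."""
    for batch in batches:
        if random.random() < budget:
            labels = annotate(batch)               # costly human annotation
            classifier.partial_fit(batch, labels)  # update only on labeled batches
    return classifier
```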