2017
DOI: 10.1007/978-3-319-59162-9_50
Combining Active Learning and Self-Labeling for Data Stream Mining

Cited by 12 publications (12 citation statements)
References 12 publications
“…Bouguelia et al [11] proposed a new query strategy based on instance weighting, but the time complexity of the weight calculation was relatively high. Korycki and Krawczyk [31] applied an active learning strategy to classify data streams and combined it with the self-labeling approach.…”
Section: Active Learning Methods For Data Stream Classification
confidence: 99%
“…It is described as a highly scalable, parallelizable and optimized online Bayesian framework using sequential Monte Carlo and the so-called gap assumption. In our previous work, we have proposed the first study on a combination of active and self-labeling solutions for data streams [23]. However, the algorithm described there used a fixed threshold and therefore was unable to efficiently adapt to the presence of concept drift.…”
Section: Labeling Constraints In Data Streams
confidence: 99%
“…This may be prohibitive in many streaming scenarios, especially if we add the fundamental cluster and smoothness requirements. In fact, results from some publications suggest that either feasible improvements from unlabeled data are barely significant or very unstable and highly dependent on characteristics of a specific stream [51].…”
Section: Limited Budget and Underfitting
confidence: 99%
“…Thus, the very few labeled objects we obtain are essential for keeping our models up-to-date and we should exploit them as much as possible to avoid wasting potential benefits. Finally, since active learning methods can be seen as exploration methods and semi-supervised ones as regularization-exploitation, approaches combining both of them are a promising research direction [48,51]. However, due to the mentioned problems with dynamic environments and using unsupervised input, in this work, we focus on investigating how significant improvements can be obtained solely from the exploitation of scarce but reliable supervised information, provided with actively selected instances.…”
Section: Limited Budget and Underfitting
confidence: 99%