2015 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2015.43
The ABACOC Algorithm: A Novel Approach for Nonparametric Classification of Data Streams

Abstract: Stream mining poses unique challenges to machine learning: predictive models must be scalable, incrementally trainable, bounded in size (even when the data stream is arbitrarily long), and nonparametric in order to achieve high accuracy even in complex and dynamic environments. Moreover, the learning system must be parameterless, since traditional tuning methods are problematic in streaming settings, and must avoid requiring prior knowledge of the number of distinct class labels occurrin…

Cited by 11 publications (26 citation statements)
References 20 publications
“…We modify the frequency-based part of this method to classify novel classes, but it did not perform well. It is also difficult to train ABACOC [27], another incremental learning method, with few samples per class. The NCM [22], NCT [23] and 1-NN [28] classifiers can learn the user's data incrementally and achieve better performance than CNN, but the speed of personalization of NCM and NCT is slow and 1-NN has a cold-start problem.…”
Section: Dataset and Procedures
confidence: 99%
“…The number of balls is dependent on the complexity of the classification problem. Unlike (De Rosa et al, 2014), where the balls were always centered on input samples, we extend the AUTO-ADJ version of ABACOC described in (De Rosa et al, 2015). In particular, a K-means-like update step makes the centre of each ball shift towards the average of the training samples that were correctly predicted by the local classifier; in other words, the balls track the feature clusters.…”
Section: (Passive) Incremental Learning
confidence: 99%
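The K-means-like update described in the citation above can be sketched as a running-mean shift of each ball's centre toward the correctly predicted samples it absorbs. The function name, the `n_correct` counter, and the per-ball bookkeeping are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def update_ball_center(center, sample, n_correct):
    """K-means-like incremental update (sketch, assumed interface):
    shift the ball's centre toward the running mean of the training
    samples that its local classifier predicted correctly.
    `n_correct` counts samples absorbed so far."""
    n_correct += 1
    # Incremental mean: c_new = c_old + (x - c_old) / n
    center = center + (sample - center) / n_correct
    return center, n_correct
```

After absorbing samples [2, 0] and [0, 2] from an initial centre at the origin, the centre sits at their mean [1, 1], so the ball tracks its feature cluster as the stream progresses.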
“…In order to curb the system's memory footprint, we adopt the simple approach proposed in (De Rosa et al, 2015), which is based on deleting existing balls whenever a given budget parameter on the label query rate is attained. This is crucial for real-time applications, as NN search, used in both training and prediction, takes time logarithmic in the number of balls.…”
Section: Constant Model Size
confidence: 99%
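The budget mechanism above can be sketched as a pruning pass over the set of balls. The dict layout and the "drop the least accurate ball" criterion are illustrative assumptions; in De Rosa et al. (2015) deletion is tied to a budget parameter on the label query rate rather than a fixed ball count:

```python
def enforce_budget(balls, budget):
    """Bound the model's memory footprint (sketch, assumed criterion):
    while the number of balls exceeds the budget, delete the ball whose
    local classifier has the lowest empirical accuracy."""
    while len(balls) > budget:
        worst = min(balls, key=lambda b: b["correct"] / max(b["queries"], 1))
        balls.remove(worst)
    return balls
```

Keeping the ball count bounded is what makes the logarithmic-time NN search cited above pay off at prediction time: the model never grows past the budget no matter how long the stream runs.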