2018
DOI: 10.1145/3139240
GOOWE: Geometrically Optimum and Online-Weighted Ensemble Classifier for Evolving Data Streams

Abstract: Designing adaptive classifiers for an evolving data stream is a challenging task due to the data size and its dynamically changing nature. Combining individual classifiers in an online setting, the ensemble approach, is a well-known solution. It is possible that a subset of classifiers in the ensemble outperforms others in a time-varying fashion. However, optimum weight assignment for component classifiers is a problem, which is not yet fully addressed in online evolving environments. We propose a novel data s…

Cited by 34 publications (3 citation statements)
References 51 publications
“…Wu and Crestani 32 first proposed a geometric framework for data fusion in the context of information retrieval. Bonab and Can 33,34 extended this framework to online multi-label data stream classification tasks. They used a dynamic weighting ensemble approach to achieve the optimal weights for all base classifiers at the data chunk level or sliding window level.…”
Section: Methods
confidence: 99%
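The chunk-level optimal weighting described in the statement above can be sketched as a linear least-squares problem: treat each component's class-score vector as a point in a geometric space and solve for the weight vector whose combined score lies closest to the one-hot true-label point, aggregated over a chunk. The following is a minimal sketch under that reading; the function and variable names are illustrative and not taken from the GOOWE implementation.

```python
import numpy as np

def optimal_weights(scores, labels):
    """Chunk-level least-squares ensemble weights (GOOWE-style sketch).

    scores: (n_instances, n_classifiers, n_classes) component class scores.
    labels: (n_instances, n_classes) one-hot ground-truth vectors.
    Returns the weight vector w minimizing
        sum_i || sum_q w[q] * scores[i, q] - labels[i] ||^2.
    """
    # Normal equations A w = d, with
    # A[q, k] = sum_i <S_iq, S_ik>  and  d[q] = sum_i <S_iq, y_i>
    A = np.einsum('iqc,ikc->qk', scores, scores)
    d = np.einsum('iqc,ic->q', scores, labels)
    # lstsq is more robust than inverting A when components are correlated
    w, *_ = np.linalg.lstsq(A, d, rcond=None)
    return w

# Toy chunk: classifier 0 is always right, classifier 1 always wrong,
# so all weight should go to classifier 0.
scores = np.array([[[1.0, 0.0], [0.0, 1.0]],
                   [[0.0, 1.0], [1.0, 0.0]]])
labels = np.array([[1.0, 0.0],
                   [0.0, 1.0]])
w = optimal_weights(scores, labels)
```

In the toy chunk the solver assigns weight 1 to the correct component and 0 to the other, which is the intended behavior of distance-minimizing weights.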
“…Method | Category | Brief Description
Online Bagging (Oza and Russell, 2001) | online | Online Bagging for stationary data streams using Poisson-based weighting
Learn++ (Polikar et al., 2001) | chunk-based | An AdaBoost-like classifier using incremental neural networks
AWE (Wang et al., 2003) | chunk-based | Accuracy-Weighted Ensemble combined with instance-based pruning
DWM (Kolter and Maloof, 2007) | online | DWM-based ensemble, considering individual and collective accuracy
OCBoost (Pelossof et al., 2009) | online | Online Coordinate Boosting as an approximated online AdaBoost
OzaBagAdwin (Bifet et al., 2009) | online | Online Bagging combined with the ADWIN drift detector
Leverage Bagging (Bifet et al., 2010b) | online | Extension of Online Bagging with randomized input and output
Learn++.NSE (Elwell and Polikar, 2011) | chunk-based | A variation of Learn++, capable of dealing with novel classes
AUE (Brzeziński and Stefanowski, 2011) | hybrid | An improvement of AWE using incremental classifiers
OSBoost (Chen et al., 2012) | online | Online Smooth Boost based on convex optimization
ADACC (Jaber et al., 2013) | online | Anticipative Dynamic Adaptation to Recurring Changes
RCD (Gonçalves Jr and De Barros, 2013) | online | Recurring Drift handling using a KNN-based similarity
AUE2 (Brzezinski and Stefanowski, 2013) | hybrid | An improvement of AUE in its weighting and updating phases
OAUE (Brzezinski and Stefanowski, 2014) | hybrid | Online Accuracy Updated Ensemble as an improvement of AUE
ADOB (Santos et al., 2014) | online | Adaptable Diversity-based Online Boosting by tuning diversity
BOLE (de Barros et al., 2016) | online | A heuristic improvement of ADOB, combined with DDM
ARF (Gomes et al., 2017b) | online | An adaptation of random forests for data streams
BLAST(HEB) (van Rijn et al., 2018) | online/chunk-based | A heterogeneous ensemble based on OPE and COD
GOOWE (Bonab and Can, 2018) | chunk-based | LLS-based Geometrically Optimum and Online-Weighted Ensemble
SRP (Gomes et al., 2019) | online | Streaming Random Patches using Bagging and Random Subspaces
KUE (Cano and Krawczyk, 2020) | hybrid | Kappa Updated Ensemble, as an extension to AUE
DiWE (Liu et al., 2020) | online | Region Drift Disagreement in Diverse Instance-Weighting Ensemble
CDCMS (Chiu and Minku, 2020) | online | Concept Drift Handling Based on Clustering in the Model Space
PWPAE (Yang et al., 2021) | online | A Perfor...…”
Section: Approach
confidence: 99%
“…Geometrically Optimum and Online-Weighted Ensemble (GOOWE) (Bonab and Can, 2018) was proposed by H. R. Bonab and F. Can in 2018. Its general processing scheme is similar to AUE2 (Brzezinski and Stefanowski, 2013) and AWE (Wang et al., 2003).…”
Section: Chunk-based Approaches
confidence: 99%
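The processing scheme that the statement above says GOOWE shares with AWE and AUE2 is, at its core: on each incoming chunk, re-weight the existing components, train a candidate classifier on the chunk, and evict the weakest member when the ensemble is full. The sketch below is a generic, hypothetical skeleton of that loop; the hooks `train_fn` and `weight_fn` stand in for each algorithm's specific training and weighting step (e.g. GOOWE's least-squares weights) and are not taken from any of the papers' code.

```python
from dataclasses import dataclass

@dataclass
class Component:
    model: object        # any trainable classifier; a plain number in the toy run
    weight: float = 0.0

class ChunkEnsemble:
    """Generic chunk-based ensemble maintenance (AWE/AUE2/GOOWE-style skeleton)."""

    def __init__(self, max_size, train_fn, weight_fn):
        self.max_size = max_size
        self.train_fn = train_fn      # chunk -> new component model
        self.weight_fn = weight_fn    # (model, chunk) -> quality weight
        self.components = []

    def process_chunk(self, chunk):
        # 1. Re-weight existing members against the newest chunk
        for c in self.components:
            c.weight = self.weight_fn(c.model, chunk)
        # 2. Train a candidate classifier on the chunk and weight it
        candidate = Component(self.train_fn(chunk))
        candidate.weight = self.weight_fn(candidate.model, chunk)
        self.components.append(candidate)
        # 3. Evict the lowest-weighted member once capacity is exceeded
        if len(self.components) > self.max_size:
            self.components.remove(min(self.components, key=lambda c: c.weight))

# Toy run: the "model" is just the chunk value and its weight equals it,
# so lower-valued chunks yield weaker members that are evicted first.
ens = ChunkEnsemble(max_size=2, train_fn=lambda ch: ch, weight_fn=lambda m, ch: m)
for chunk in (1.0, 3.0, 2.0):
    ens.process_chunk(chunk)
```

After the three chunks, the ensemble holds the two strongest members (weights 2.0 and 3.0); the weakest (1.0) was evicted. The real algorithms differ precisely in how `weight_fn` is computed: mean-squared-error-based for AWE/AUE2, geometric least-squares for GOOWE.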