2017
DOI: 10.1007/s10994-017-5642-8
Adaptive random forests for evolving data stream classification

Abstract: Random forests is currently one of the most used machine learning algorithms in the non-streaming (batch) setting. This preference is attributable to its high learning performance and low demands with respect to input preparation and hyper-parameter tuning. However, in the challenging context of evolving data streams, there is no random forests algorithm that can be considered state-of-the-art in comparison to bagging and boosting based algorithms.

Cited by 571 publications (444 citation statements)
References 39 publications (66 reference statements)
“…Interestingly, the accuracy ranks (shown in parentheses) of three out of four of these classifiers indicate that they collectively accounted for winners across the entire range of datasets. The accuracy profiles of LB and ARF reinforce results from previous studies where they excelled in performance Bifet et al (2010b), Gomes et al (2017). The strong performance of ARF is to be expected as the basic Random Forest algorithm has proved its superior performance in numerous studies and ARF builds on this success by adapting the forest to concept drift in a data stream environment.…”
Section: Accuracy Evaluation (supporting)
confidence: 78%
“…The nine meta learning algorithms derived from MOA are: Adaptive Random Forest (ARF) Gomes et al (2017), OzaBagASHT (AS), OzaBagADWIN (OB) Bifet et al (2009), LeveragingBag (LB) Bifet et al (2010b), LimAttClassifier (LA) Bifet et al (2010a), AccuracyWeightedEnsemble (AWE) Wang et al (2003), AccuracyUpdatedEnsemble (AUE) Brzeziński & Stefanowski (2011), Anticipative Dynamic Adaptation to Concept Change (AD) Jaber et al (2013a) and Dynamic Adaptation to Concept Changes (DA) Jaber et al (2013b).…” [A per-algorithm accuracy table, giving accuracies with ranks in parentheses, was garbled during extraction and is omitted here.]
Section: Algorithms Used In Study (mentioning)
confidence: 99%
“…This preference is attributable to its high learning performance and low demands with respect to input preparation and hyper-parameter tuning. [17] The Random forest algorithm is an ensemble learning method that takes average results of several decision trees to classify its samples. As the name denotes, each decision tree is trained on a random training data subset, perhaps using random features as well.…”
Section: A. Prediction Results (mentioning)
confidence: 99%
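The quoted passage describes the core random forest recipe: each tree is trained on a bootstrap sample of the data, with randomness in the features as well, and the ensemble predicts by majority vote. A minimal illustrative sketch of that idea (not the cited paper's implementation — the base learner here is a toy depth-1 tree on a single randomly chosen feature, and all names are hypothetical):

```python
import random
from collections import Counter

def train_stump(X, y, feature):
    """Toy base learner: a depth-1 tree splitting one feature at its median."""
    thresh = sorted(x[feature] for x in X)[len(X) // 2]
    left  = [yi for x, yi in zip(X, y) if x[feature] <= thresh] or y
    right = [yi for x, yi in zip(X, y) if x[feature] >  thresh] or y
    lmaj = Counter(left).most_common(1)[0][0]
    rmaj = Counter(right).most_common(1)[0][0]
    return lambda x: lmaj if x[feature] <= thresh else rmaj

def random_forest(X, y, n_trees=25, seed=0):
    """Bagging + a random feature per tree; predict by majority vote."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]          # bootstrap sample
        Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
        f = rng.randrange(len(X[0]))                      # random feature
        trees.append(train_stump(Xb, yb, f))
    return lambda x: Counter(t(x) for t in trees).most_common(1)[0][0]

# Toy separable data: class 1 when both features are high.
X = [[0.1, 0.2], [0.2, 0.1], [0.3, 0.3], [0.8, 0.9], [0.9, 0.7], [0.7, 0.8]]
y = [0, 0, 0, 1, 1, 1]
predict = random_forest(X, y)
print(predict([0.9, 0.9]), predict([0.1, 0.1]))
```

Real forests grow full trees and sample a feature subset at every split; the averaging over independently randomized trees is what yields the robustness the quote attributes to the method.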
“…Moreover, a linear function on a corresponding subspace is specified for each leaf of the induced tree, and learning these functions is accomplished using stochastic gradient descent. An ensemble version of FIMTDD (adaptive random forest, ARF-Reg) was proposed by Gomes et al (2018), using an online version of bagging for creating the ensemble members (Oza and Russell, 2001).…”
Section: Related Work (mentioning)
confidence: 99%
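The online bagging scheme of Oza and Russell (2001) mentioned above replaces batch bootstrap sampling with a streaming equivalent: each arriving example is given to each ensemble member k times, where k is drawn from Poisson(1). A hedged sketch of that weighting scheme (the base learner below is a stand-in majority-class counter, not FIMTDD or ARF-Reg; all class and function names are hypothetical):

```python
import math
import random
from collections import Counter

def poisson1(rng):
    """Sample k ~ Poisson(lambda=1) via Knuth's multiplication method."""
    limit, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

class MajorityLearner:
    """Stand-in base learner: predicts the majority label it has seen,
    weighting each example by its Poisson draw."""
    def __init__(self):
        self.counts = Counter()
    def partial_fit(self, x, y, weight):
        self.counts[y] += weight
    def predict(self, x):
        return self.counts.most_common(1)[0][0] if self.counts else None

class OnlineBagging:
    """Each incoming example is learned k ~ Poisson(1) times by each
    member, approximating a bootstrap sample over an unbounded stream."""
    def __init__(self, n_members=10, seed=0):
        self.rng = random.Random(seed)
        self.members = [MajorityLearner() for _ in range(n_members)]
    def learn_one(self, x, y):
        for m in self.members:
            k = poisson1(self.rng)
            if k > 0:
                m.partial_fit(x, y, k)
    def predict_one(self, x):
        votes = Counter(m.predict(x) for m in self.members)
        return votes.most_common(1)[0][0]

ens = OnlineBagging()
stream = [((i,), 1 if i % 3 == 0 else 0) for i in range(100)]
for x, y in stream:
    ens.learn_one(x, y)
print(ens.predict_one((5,)))  # majority class of the stream is 0
```

Poisson(1) is used because, as the stream length grows, the number of times a batch bootstrap would select a given example converges to a Poisson(1) distribution, so the ensemble members stay diverse without storing the stream.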