2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9206756
On Ensemble Techniques for Data Stream Regression

Abstract: An ensemble of learners tends to exceed the predictive performance of individual learners. This approach has been explored for both batch and online learning. Ensemble methods applied to data stream classification have been thoroughly investigated over the years, while their regression counterparts have received comparatively less attention. In this work, we discuss and analyze several techniques for generating, aggregating, and updating ensembles of regressors for evolving data streams. We investigate the impact of di…

Cited by 18 publications (18 citation statements) · References 27 publications
“…how predictions are combined and how diversity is induced. These were recently empirically analyzed in (Gomes, Montiel, Mastelini, Pfahringer, & Bifet, 2020).…”
Section: Supervised Learning
Confidence: 99%
“…The combination of random subspaces and instance sampling for data streams was recently explored in [22,23]. In [23], the authors presented theoretical insights related to ensembles of Hoeffding trees and several empirical experiments showing the advantage of combining online bagging and random subspaces with an active drift detection strategy.…”
Section: Related Work
Confidence: 99%
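The online bagging scheme referenced above (showing each streaming instance to every ensemble member a Poisson-distributed number of times, so that bootstrap resampling is approximated without buffering the stream) can be sketched as follows. This is a minimal illustration, not the cited papers' implementation; `MeanRegressor` and all class and method names here are assumptions made for the example.

```python
import random

class MeanRegressor:
    """Trivial incremental base learner, used only for illustration:
    predicts the running mean of the targets seen so far."""
    def __init__(self):
        self.n, self.total = 0, 0.0

    def learn_one(self, x, y):
        self.n += 1
        self.total += y

    def predict_one(self, x):
        return self.total / self.n if self.n else 0.0

class OnlineBaggingRegressor:
    """Online bagging in the style of Oza & Russell: each incoming
    instance is shown to each member k ~ Poisson(1) times, which
    approximates bootstrap resampling on an unbounded stream."""
    def __init__(self, base_learner_factory, n_estimators=10, seed=42):
        self.rng = random.Random(seed)
        self.members = [base_learner_factory() for _ in range(n_estimators)]

    def _poisson1(self):
        # Knuth's method for sampling Poisson(lambda = 1)
        limit, k, p = 2.718281828459045 ** -1, 0, 1.0
        while True:
            p *= self.rng.random()
            if p <= limit:
                return k
            k += 1

    def learn_one(self, x, y):
        for member in self.members:
            # The Poisson draw acts as this member's instance weight.
            for _ in range(self._poisson1()):
                member.learn_one(x, y)

    def predict_one(self, x):
        # Aggregate member outputs by simple averaging, a common
        # combination rule for regression ensembles.
        preds = [m.predict_one(x) for m in self.members]
        return sum(preds) / len(preds)
```

In practice the base learner would be an incremental model such as a Hoeffding tree regressor, and a drift detector could reset members whose error degrades, as the active drift detection strategy described in [23] does.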
“…In [23], the authors presented theoretical insights related to ensembles of Hoeffding trees and several empirical experiments showing the advantage of combining online bagging and random subspaces with an active drift detection strategy. In [22], the authors focused on applying random subspaces and online bagging to regression. The analysis was largely empirical, and the conclusion was that local feature randomization [20] produced better predictive performance than global feature randomization schemes [23].…”
Section: Related Work
Confidence: 99%
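To make the contrast in the statement above concrete: under global feature randomization, each ensemble member is bound to one fixed random feature subset for its entire lifetime, whereas local randomization (as in random forests) re-draws candidate features at every tree split. A minimal global random-subspace ensemble might look like the sketch below; the names and the trivial base learner are assumptions for illustration, not the cited authors' code.

```python
import random

class MeanRegressor:
    """Illustrative incremental base learner: predicts the running mean."""
    def __init__(self):
        self.n, self.total = 0, 0.0
    def learn_one(self, x, y):
        self.n += 1
        self.total += y
    def predict_one(self, x):
        return self.total / self.n if self.n else 0.0

class GlobalSubspaceEnsemble:
    """Global feature randomization: each member sees only one fixed
    random subset of the features, drawn once at construction time."""
    def __init__(self, base_learner_factory, feature_names, subspace_size,
                 n_estimators=10, seed=7):
        rng = random.Random(seed)
        self.members = [
            (rng.sample(feature_names, subspace_size), base_learner_factory())
            for _ in range(n_estimators)
        ]

    def learn_one(self, x, y):
        for subset, member in self.members:
            # Project the instance onto this member's fixed subspace.
            member.learn_one({f: x[f] for f in subset}, y)

    def predict_one(self, x):
        preds = [m.predict_one({f: x[f] for f in subset})
                 for subset, m in self.members]
        return sum(preds) / len(preds)
```

A locally randomized variant would push the sampling inside the base learner, choosing a fresh feature subset at each split attempt rather than fixing one per member; the cited conclusion is that this finer-grained randomization tended to yield better predictive performance.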