2017 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2017.7965964
A partial labeling framework for multi-class imbalanced streaming data

Cited by 5 publications (3 citation statements)
References 17 publications
“…It analyzes the data distribution from the perspectives of time and space, and presents a self-adaptive dual weighting strategy: weights built on the class imbalance ratio at the time level, and weights based on the probability density of samples at the space level. In addition, the RLS-Multi (Reduced Labeled Samples-Multiple class) [105] approach was proposed, which uses DWM (Dynamic Weighted Majority) for ensemble voting. It handles multi-class imbalanced classification well in conjunction with partially labeled (i.e., label-missing) data streams.…”
Section: PT-Based Multi-label Data Stream Methods
confidence: 99%
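The dual weighting strategy quoted above can be illustrated with a small sketch: a per-sample weight combining a time-level term (inverse class frequency over the current window) with a space-level term (an estimated probability density at the sample). The Gaussian-kernel density, the product combination, and all function names here are illustrative assumptions, not the cited paper's exact formulation.

```python
import math
from collections import Counter


def time_level_weights(window_labels):
    """Inverse class-frequency weights over the current window:
    minority classes receive larger weights."""
    counts = Counter(window_labels)
    n = len(window_labels)
    return {c: n / (len(counts) * k) for c, k in counts.items()}


def space_level_weight(x, window_X, bandwidth=1.0):
    """Naive 1-D Gaussian kernel density estimate at x over the window."""
    z = 1.0 / (len(window_X) * bandwidth * math.sqrt(2 * math.pi))
    return z * sum(math.exp(-((x - xi) / bandwidth) ** 2 / 2) for xi in window_X)


def dual_weight(x, y, window_X, window_labels):
    """Combine both levels into one sample weight (product is an assumption)."""
    return time_level_weights(window_labels)[y] * space_level_weight(x, window_X)
```

With a window of three samples of class "a" and one of class "b", the minority class "b" gets the larger time-level weight, while samples in dense feature regions get larger space-level weights.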
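The DWM scheme that RLS-Multi builds on can be sketched as a weighted-majority ensemble: experts that misclassify are demoted by a factor beta, experts whose weight drops below a threshold theta are pruned, and a fresh expert is added whenever the ensemble as a whole errs. The `Expert` placeholder and the beta/theta defaults below are illustrative, not from the cited works.

```python
class Expert:
    """A trivial majority-class 'learner' standing in for a real base model."""
    def __init__(self):
        self.counts = {}

    def fit(self, x, y):
        self.counts[y] = self.counts.get(y, 0) + 1

    def predict(self, x):
        return max(self.counts, key=self.counts.get) if self.counts else 0


def dwm_predict(experts, weights, x):
    """Weighted-majority vote across all experts."""
    votes = {}
    for e, w in zip(experts, weights):
        c = e.predict(x)
        votes[c] = votes.get(c, 0.0) + w
    return max(votes, key=votes.get)


def dwm_update(experts, weights, x, y, beta=0.5, theta=0.01):
    """Process one labeled stream example in place."""
    for i, e in enumerate(experts):
        if e.predict(x) != y:
            weights[i] *= beta          # demote experts that erred
    ensemble_wrong = (not experts) or dwm_predict(experts, weights, x) != y
    kept = [(e, w) for e, w in zip(experts, weights) if w >= theta]
    experts[:] = [e for e, _ in kept]   # prune low-weight experts
    weights[:] = [w for _, w in kept]
    if ensemble_wrong:
        experts.append(Expert())        # add a fresh expert on ensemble error
        weights.append(1.0)
    for e in experts:
        e.fit(x, y)                     # incrementally train every expert
```

Feeding a short stream of identical labels keeps the ensemble at one expert; the first surprise label demotes it and spawns a new expert.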
“…In [117], another multi-label stream data generation framework is built on a basic generator. A common method, introduced in [105], [116], [120], models concept drift by creating a new multi-label data stream with shuffled labels. In contrast to the aforementioned methods, the authors in [35] consider two categories of label dependency from a probabilistic angle, namely conditional dependence and unconditional dependence.…”
Section: Synthetic Datasets
confidence: 99%
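The label-shuffling drift scheme mentioned above can be sketched as a toy generator: labels follow a fixed feature rule, and at a chosen drift point the label positions are permuted so the feature-to-label mapping abruptly changes. The generator name, the toy threshold rule, and the parameter defaults are illustrative, not the cited papers' generators.

```python
import random


def shuffled_label_stream(n, n_labels=4, n_features=3, drift_at=None, seed=0):
    """Yield (features, label_vector) pairs. At step `drift_at` the
    label positions are permuted, simulating an abrupt concept drift."""
    rng = random.Random(seed)
    perm = list(range(n_labels))        # identity mapping before the drift
    for t in range(n):
        if t == drift_at:
            rng.shuffle(perm)           # concept drift: shuffle the label order
        x = [rng.random() for _ in range(n_features)]
        # Toy concept: label i fires when its associated feature is large.
        raw = [1 if x[i % n_features] > 0.5 else 0 for i in range(n_labels)]
        yield x, [raw[perm[i]] for i in range(n_labels)]
```

Before the drift point the permutation is the identity, so the labels match the underlying rule; afterwards the same rule produces permuted label vectors.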
“…Among the studies analysed in this work, 65.71 percent use an ensemble approach to adapt to the new concept. An ensemble-based approach consists of multiple trained classifiers (Arabmakki et al, 2017; Khandekar & Shrinath, 2022; Palli et al, 2023; Vafaie et al, 2020; Vasantha et al, 2019). When it receives a drift signal, the ensemble-based approach creates a new classifier, trains it on the latest data, and adds it to the ensemble; it removes the classifier with the poorest performance.…”
Section: Ensemble Approach
confidence: 99%
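The drift-adaptation loop described in this last statement can be sketched as follows: on a drift signal, train a new classifier on the most recent window, add it to the ensemble, and drop the worst performer if the ensemble is full. All names here are illustrative placeholders rather than any one cited method.

```python
def on_drift(ensemble, scores, recent_X, recent_y, train_fn, max_size=10):
    """Handle one drift signal for an ensemble kept as parallel lists
    of classifiers and their performance scores (higher is better)."""
    new_clf = train_fn(recent_X, recent_y)   # train a classifier on latest data
    ensemble.append(new_clf)
    scores.append(1.0)                       # optimistic initial score
    if len(ensemble) > max_size:
        worst = min(range(len(scores)), key=scores.__getitem__)
        del ensemble[worst], scores[worst]   # remove the poorest classifier
    return ensemble, scores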