2013
DOI: 10.1016/j.neunet.2013.02.007

Dynamic class imbalance learning for incremental LPSVM

Abstract: Linear Proximal Support Vector Machines (LPSVMs), like decision trees and classic SVMs, are not originally equipped to handle drifting data streams that exhibit high and varying degrees of class imbalance. For online classification of data streams with imbalanced class distribution, we propose a dynamic class imbalance learning (DCIL) approach to incremental LPSVM (IncLPSVM) modeling. In doing so, we simplify a computationally non-renewable weighted LPSVM to several core matrices multiplying tw…
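To make the setting concrete, here is a minimal sketch of a weighted linear proximal SVM in batch form, assuming the standard PSVM objective (nu/2) * sum_i s_i * (y_i(w·x_i - b) - 1)^2 + (1/2)(||w||^2 + b^2), which reduces to a single (d+1)x(d+1) linear system. The inverse-frequency class weights and all names below are illustrative assumptions, not the paper's exact DCIL weighting scheme.

```python
import numpy as np

def fit_weighted_lpsvm(X, y, nu=1.0, class_weights=None):
    """Batch weighted linear proximal SVM (sketch).

    Solves  min_{w,b}  (nu/2) * sum_i s_i * (y_i*(x_i.w - b) - 1)^2
                       + (1/2) * (||w||^2 + b^2),
    whose optimality condition is one (d+1)x(d+1) linear system.
    Assumes labels y in {-1, +1} with both classes present.
    """
    n, d = X.shape
    if class_weights is None:
        # Illustrative inverse-frequency weighting; the paper's DCIL
        # weights adapt dynamically and may differ from this heuristic.
        class_weights = {c: n / (2.0 * np.sum(y == c)) for c in (-1, 1)}
    s = np.array([class_weights[c] for c in y])      # per-sample costs
    E = np.hstack([X, -np.ones((n, 1))])             # E = [A, -e]
    # "Core matrices": sufficient statistics that an incremental
    # variant could update chunk by chunk instead of refitting.
    M = E.T @ (E * s[:, None])                       # E^T S E
    v = E.T @ (s * y)                                # E^T S D e
    u = np.linalg.solve(np.eye(d + 1) / nu + M, v)
    return u[:-1], u[-1]                             # (w, b)

def predict_lpsvm(X, w, b):
    return np.sign(X @ w - b)
```

In an online setting, M and v would be accumulated over incoming chunks, and reweighted as the class distribution drifts, which is roughly the renewability the abstract alludes to.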

Cited by 27 publications (11 citation statements), published 2015-2021. References 36 publications.

Citation statements (ordered by relevance):
“…In a further set of experiments, we compare our proposed approaches HE_S, HE_A, HE_FUS and HE_WFUS with the best method introduced in [50], named DCIL, which was shown in that paper to outperform several other methods, such as LPSVM, SVM, SMOTE+LPSVM, weighted LPSVM, weighted SVM and partitioning ensemble SVM. In these tests, we consider the same UCI datasets used with DCIL, along with the same testing protocol (we adopt the testing protocol where the highest degree of class imbalance occurs, which is called 'Exp 3' in [50]). Each experiment consists of 20 rounds of independent tests using the F-measure as the performance indicator (we use this same measure so that our results can be directly compared with those in [50]).…”
Section: Comparisons With the Literature
Mentioning (confidence: 99%)
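For reference, the F-measure used in this protocol is the standard harmonic mean of precision and recall on the positive (typically minority) class. A minimal sketch, assuming labels in {-1, +1} with the minority class coded as +1; the round/split logic of [50] is not reproduced here:

```python
import numpy as np

def f_measure(y_true, y_pred, positive=1):
    """Standard F1 score on the chosen positive class."""
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0

# Tiny check: 2 TP, 1 FP, 1 FN -> precision = recall = 2/3 -> F1 = 2/3.
y_true = np.array([1, 1, -1, -1, -1, 1])
y_pred = np.array([1, -1, -1, -1, 1, 1])
print(f_measure(y_true, y_pred))
```

Each experiment's 20 rounds would then be summarized by the mean of these per-round scores.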
“…A chunk incremental algorithm based on the cost-sensitive SVM with extended hinge loss was proposed by Gu et al [9]. Pang et al proposed an incremental wLPSVM to tackle the dynamic class imbalance problem [10]. The incremental cost-sensitive model remains an exciting and required research topic for online classification.…”
Section: Introduction
Mentioning (confidence: 99%)
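The exact form of the extended hinge loss in Gu et al. [9] is not given here; as a hedged illustration of the generic cost-sensitive idea such methods build on, a per-class weighted hinge loss can be sketched as follows (names and default weights are assumptions, not Gu et al.'s formulation):

```python
import numpy as np

def cost_sensitive_hinge(scores, y, cost_pos=1.0, cost_neg=1.0):
    """Per-class weighted hinge: mean_i c_{y_i} * max(0, 1 - y_i * f(x_i)).

    Illustrates the generic cost-sensitive idea only; the extended
    hinge loss of Gu et al. [9] may differ in its exact form.
    """
    costs = np.where(y == 1, cost_pos, cost_neg)
    return float(np.mean(costs * np.maximum(0.0, 1.0 - y * scores)))
```

Raising cost_pos above cost_neg makes errors on the positive (minority) class more expensive, which is the basic lever that cost-sensitive and DCIL-style methods adjust as the class ratio drifts.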
“…In addition, due to the difference in number of samples belonging to the past and the new tasks, we argue that the class imbalance problem exacerbates the overfitting on the new tasks, which leads to accuracy drop. The class imbalance problem, however, is relatively less explored in the IL context [9], [38], [39]. Ditzler et al propose a dynamically-weighted consult and vote (DW-CAV) algorithm for an ensemble of classifiers for better plasticity [39].…”
Section: Introduction
Mentioning (confidence: 99%)