2019
DOI: 10.1088/1742-6596/1237/2/022150
Research on the incremental learning SVM algorithm based on the improved generalized KKT condition

Abstract: To adapt to the classification of large-scale and dynamic data, this paper proposes an incremental SVM learning strategy, the GGKKT-ISVM algorithm, based on the generalized KKT condition. The algorithm sets generalized extension factors according to the samples' distribution density so that useful samples become new support vectors, and it trains a new classifier. It then modifies the classifier a second time, which not only retains the historical classification informa…
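The abstract describes the general KKT-based incremental learning scheme but not the exact density-based extension factors, so the sketch below only illustrates the shared skeleton of such methods: after each increment, the previous support vectors are kept and combined with the new samples that violate the current model's (relaxed) KKT conditions, and the classifier is retrained on that reduced set. This is a minimal sketch in Python with scikit-learn, not the paper's method; the fixed `band` parameter stands in for the paper's density-derived extension factor, and the function names `kkt_violations` and `incremental_svm` are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def kkt_violations(clf, X, y, band=1.0):
    # Signed distance of each sample to the current separating hyperplane.
    scores = clf.decision_function(X)
    # Samples inside the band (or misclassified) violate the relaxed KKT
    # conditions of the current model and are kept for retraining.
    # Labels are assumed to be encoded as -1 / +1.
    return y * scores < band

def incremental_svm(batches, C=1.0, kernel="rbf", band=1.0):
    clf = SVC(C=C, kernel=kernel)
    X_keep = y_keep = None
    for X_new, y_new in batches:
        if X_keep is None:
            # First increment: train on the initial batch as-is.
            X_train, y_train = X_new, y_new
        else:
            # Later increments: keep old support vectors plus new KKT violators.
            mask = kkt_violations(clf, X_new, y_new, band)
            X_train = np.vstack([X_keep, X_new[mask]])
            y_train = np.concatenate([y_keep, y_new[mask]])
        clf.fit(X_train, y_train)
        # The support vectors summarize the historical classification information.
        X_keep = clf.support_vectors_
        y_keep = y_train[clf.support_]
    return clf
```

Here `batches` can be any iterable of (X, y) chunks from a data stream; widening `band` keeps more borderline samples per increment, trading training cost for robustness.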

Cited by 4 publications (3 citation statements)
References 3 publications
“…For the classification problem of imbalanced data, besides improvements of the classical algorithms and the proposal of novel ones, some ensemble approaches have been proposed that integrate the classical algorithms with novel strategies, such as the three-way decision ensemble [21] and the samples' selection strategy [22]. In addition, sampling algorithms for multi-class imbalanced data have received more and more attention in recent years.…”
Section: Introduction (mentioning)
confidence: 99%
“…Based on the experiments above, this round of experiments compares the training results of Simple-ISVM [22], KKT-ISVM [23,24], CSV-ISVM [14], GGKKT-ISVM [25], CD-ISVM [26], HDFC-ISVM, and HDFC-ISVM* (HDFC-ISVM* is the improved algorithm based on the original HDFC-ISVM algorithm) on the 12 datasets listed in Table 12. In this experiment, the initial training datasets for all the algorithms contain 500 samples.…”
Section: The Experimental Results (mentioning)
confidence: 99%
“…Previously, many classical ISVM learning algorithms have been proposed, including Simple_ISVM [22], KKT_ISVM [23,24], CSV_ISVM [14], GGKKT_ISVM [25], CD_ISVM [26], and the other ISVM algorithms mentioned above; these algorithms provide different methods for selecting incremental learning training samples from different perspectives. However, the classifier's ability to gradually accumulate knowledge of the spatial distribution of samples is still not fully developed, so accuracy and efficiency can be further improved.…”
Section: Description of the Original HDFC-ISVM Algorithm (mentioning)
confidence: 99%