2018
DOI: 10.1016/j.patcog.2017.09.012

Hyperparameter selection of one-class support vector machine by self-adaptive data shifting

Abstract: With flexible data description ability, one-class Support Vector Machine (OCSVM) is one of the most popular and widely-used methods for one-class classification (OCC). Nevertheless, the performance of OCSVM strongly relies on its hyperparameter selection, which is still a challenging open problem due to the absence of outlier data. This paper proposes a fully automatic OCSVM hyperparameter selection method, which requires no tuning of additional hyperparameters, based on a novel self-adaptive "data shifting" me…
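The abstract's core idea, generating pseudo data from the target class itself so that OCSVM hyperparameters can be validated without real outliers, can be illustrated with a minimal sketch. The code below is not the paper's self-adaptive data-shifting algorithm: the pseudo outliers come from a deliberately simplified rule (each training point is pushed outward from the sample mean by a fixed distance), and the (gamma, nu) grid, the shift distance, and the scoring rule are all illustrative assumptions.

```python
# Minimal sketch: OCSVM hyperparameter selection scored against
# self-generated pseudo outliers. This is a simplified stand-in for the
# paper's self-adaptive data shifting; the shift rule below is an
# assumption for illustration, not the published algorithm.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))              # target (one-class) training data

# Pseudo outliers: push each point away from the sample mean.
center = X.mean(axis=0)
d = X - center
X_out = X + 2.0 * d / (np.linalg.norm(d, axis=1, keepdims=True) + 1e-12)

best, best_score = None, -np.inf
for gamma in (0.01, 0.1, 1.0, 10.0):       # illustrative grid, not the paper's
    for nu in (0.01, 0.05, 0.1, 0.2):
        model = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X)
        target_acc = (model.predict(X) == 1).mean()        # accept targets
        outlier_rej = (model.predict(X_out) == -1).mean()  # reject pseudo outliers
        score = 0.5 * (target_acc + outlier_rej)
        if score > best_score:
            best, best_score = (gamma, nu), score

print("selected (gamma, nu):", best, "balanced score:", round(best_score, 3))
```

According to the abstract, the actual method requires no tuning of additional hyperparameters for its data generation; the fixed shift of 2.0 and the grids above are hand-picked purely for demonstration.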

Cited by 66 publications (62 citation statements)
References 20 publications
“…Considering the fact that we do not have a ground truth for activated glia, we cannot rule out that we are over- or under-classifying the activation of glial cells. Tuning hyperparameters for one-class support vector machines is a critical, and often difficult, task for which a consensus on optimal methodology has yet to be reached 35. A common technique is to maximize accuracy, while minimizing the number of false positives, based on labeled data (i.e., the class of the data is known) 33,34.…”
Section: Discussion
confidence: 99%
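The "common technique" this statement refers to, tuning OCSVM hyperparameters on labeled data by maximizing accuracy while keeping false positives low, might look like the sketch below. All names (X_train, X_val, y_val, max_fpr) are hypothetical, and the convention that y_val marks known outliers with -1 is an assumption made for illustration.

```python
# Sketch of hyperparameter selection with a labeled validation set:
# maximize accuracy subject to a cap on the false-positive rate
# (known outliers wrongly accepted as targets).
import numpy as np
from sklearn.svm import OneClassSVM

def select_ocsvm(X_train, X_val, y_val, max_fpr=0.05):
    """y_val uses +1 for inliers and -1 for known outliers."""
    best, best_acc = None, -1.0
    for gamma in (0.01, 0.1, 1.0, 10.0):
        for nu in (0.01, 0.05, 0.1):
            model = OneClassSVM(kernel="rbf", gamma=gamma, nu=nu).fit(X_train)
            pred = model.predict(X_val)
            acc = (pred == y_val).mean()
            fpr = (pred[y_val == -1] == 1).mean()   # outliers accepted as inliers
            if fpr <= max_fpr and acc > best_acc:
                best, best_acc = (gamma, nu), acc
    return best, best_acc
```

This presupposes labeled outliers, which, as the next citation statement notes, can often be difficult to acquire; that gap is what the cited paper's automatic approach targets.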
“…outlier data) can often be difficult to acquire. More advanced methods deploy a variety of strategies which focus on identifying patterns in the one-class itself in order to maximize the capacity to distinguish normal cases from outliers 35.…”
Section: Discussion
confidence: 99%
“…Second, we selected the optimal sets of parameters for each kernel. Hyperparameter choice is known to have substantial impacts on classification, but it is an open problem for one-class SVMs 22. Since the parameters for each kernel needed to be selected separately (Supplementary Note), we could not use AUC for assessment.…”
Section: SysSVM2 Optimisation On the Pan-cancer Reference Cohort
confidence: 99%
“…With the rapid development of computer technologies, business and government organizations create large amounts of data, which need to be processed and analyzed. Over the past decade, to satisfy the urgent need of mining knowledge hidden in the data, numerous machine learning models [1,2] (e.g., decision tree [3], Bayesian network [4,5], support vector machine [6] and Neural network [7]) have been proposed.…”
Section: Introduction
confidence: 99%