2022
DOI: 10.1016/j.eswa.2021.115920
Using binary classifiers for one-class classification

Cited by 18 publications
(5 citation statements)
References 32 publications
“…In cases of extreme class imbalance, where obtaining and labelling the positive class is difficult, one-class classification (OCC), also known as anomaly detection, is a useful alternative, as it requires only negative-class examples to train the classifier. Essentially, this unsupervised method learns a decision boundary around the target class (inliers) and identifies instances outside of it as anomalies or outliers [30][31][32][33]. Anomaly detection has been applied to Alzheimer's disease diagnosis [34], identifying acute myeloid leukemia-associated genes [35], and abnormal skin tissue detection [36].…”
Section: ML (mentioning)
confidence: 99%
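The idea in the statement above — learn a boundary around the target class from negative examples alone, then flag anything outside it — can be sketched minimally. This is not the cited method; it is a toy sketch in which a spherical radius around the training mean stands in for the learned decision boundary, with synthetic data and a hypothetical 95% coverage threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Train on the target (inlier) class only: no anomalous examples are needed.
train = rng.normal(loc=0.0, scale=1.0, size=(200, 2))

# Fit a spherical boundary around the target class: centred at the mean,
# with the radius set so 95% of training points fall inside it.
centre = train.mean(axis=0)
radii = np.linalg.norm(train - centre, axis=1)
threshold = np.quantile(radii, 0.95)

def predict(x):
    """Return +1 for inliers (inside the boundary), -1 for outliers."""
    return np.where(np.linalg.norm(x - centre, axis=1) <= threshold, 1, -1)

inliers = rng.normal(0.0, 1.0, size=(50, 2))
outliers = rng.normal(8.0, 1.0, size=(5, 2))   # far from the target class
print(predict(outliers))  # → [-1 -1 -1 -1 -1]
```

Points drawn near the training distribution fall inside the boundary, while the distant cluster is rejected — the essential behaviour of a one-class classifier, regardless of how the boundary itself is parameterised.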
“…At test time, the classifier with the highest confidence score is chosen as the predicted class. Another approach is the one-versus-one (OVO) strategy, in which a separate binary SVM classifier is trained for every pair of classes; at test time, the class with the most votes from the individual classifiers is chosen as the predicted class [30]. A further approach is a multiclass SVM algorithm, which uses a single SVM classifier trained to distinguish between all of the classes at once, rather than training multiple binary classifiers [31].…”
Section: Classification With Svm (mentioning)
confidence: 99%
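The OVO voting scheme described above can be sketched without any SVM machinery: train one binary classifier per pair of classes, and let each cast a vote at test time. In this sketch a nearest-of-two-centroids rule stands in for the binary SVM of the text, and the three toy classes are hypothetical.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Three well-separated 2-D classes (toy data).
centres = {0: (0, 0), 1: (5, 0), 2: (0, 5)}
X = np.vstack([rng.normal(c, 0.5, size=(30, 2)) for c in centres.values()])
y = np.repeat([0, 1, 2], 30)

# One-versus-one: fit one binary classifier per PAIR of classes.
# Here a nearest-centroid rule stands in for a binary SVM.
pairs = {}
for a, b in combinations(np.unique(y), 2):
    pairs[(a, b)] = (X[y == a].mean(axis=0), X[y == b].mean(axis=0))

def predict(x):
    votes = np.zeros(3, dtype=int)
    for (a, b), (ca, cb) in pairs.items():
        winner = a if np.linalg.norm(x - ca) <= np.linalg.norm(x - cb) else b
        votes[winner] += 1          # each pairwise classifier casts one vote
    return int(votes.argmax())      # the class with the most votes wins

print(predict(np.array([4.8, 0.2])))  # near class 1 → 1
```

With k classes, OVO trains k(k−1)/2 pairwise classifiers (3 here), versus k classifiers for one-versus-rest or a single model for a direct multiclass SVM — the trade-off the quoted passage is describing.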
“…The method has produced excellent results on various datasets. Kanag [23] applies a clustering technique to partition the data into many clusters. These clusters are then used with the one-against-rest method to create many binary classifiers.…”
Section: Literature Survey (mentioning)
confidence: 99%
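The cluster-then-one-against-rest idea in the statement above can be sketched as follows. This is a toy illustration, not the cited method: a minimal k-means splits the target class into clusters, and each cluster gets its own binary accept/reject rule (here a hypothetical 95%-coverage radius standing in for a trained binary classifier); a point is accepted if any per-cluster classifier accepts it.

```python
import numpy as np

rng = np.random.default_rng(2)

# Target-class data drawn from two modes, so one global boundary fits
# poorly; clustering first lets each mode be covered separately.
X = np.vstack([rng.normal((0, 0), 0.5, size=(60, 2)),
               rng.normal((6, 6), 0.5, size=(60, 2))])

# Minimal k-means (k = 2), deterministically initialised at the data extremes.
centres = np.array([X.min(axis=0), X.max(axis=0)])
for _ in range(10):
    labels = np.array([np.argmin([np.linalg.norm(x - c) for c in centres])
                       for x in X])
    centres = np.array([X[labels == k].mean(axis=0) for k in range(2)])

# One-against-rest: each cluster gets its own binary accept/reject rule,
# here a radius covering 95% of that cluster's points.
radii = [np.quantile(np.linalg.norm(X[labels == k] - centres[k], axis=1), 0.95)
         for k in range(2)]

def is_inlier(x):
    """Accept a point if ANY per-cluster binary classifier accepts it."""
    return any(np.linalg.norm(x - centres[k]) <= radii[k] for k in range(2))

print(is_inlier(np.array([0.1, 0.1])), is_inlier(np.array([3.0, 3.0])))
# → True False
```

The point midway between the two modes is rejected even though it lies inside the bounding box of the data — the benefit of per-cluster binary classifiers over a single boundary around a multimodal target class.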