Proceedings of the 22nd ACM International Conference on Information & Knowledge Management, 2013
DOI: 10.1145/2505515.2505619

Combining one-class classifiers via meta learning

Abstract: Selecting the best classifier among the available ones is a difficult task, especially when only instances of one class exist. In this work we examine the notion of combining one-class classifiers as an alternative for selecting the best classifier. In particular, we propose two one-class classification performance measures to weigh classifiers and show that a simple ensemble that implements these measures can outperform the most popular one-class ensembles. Furthermore, we propose a new one-class ensemble sch…
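
The abstract describes weighing one-class classifiers by a performance measure and combining them instead of picking a single best classifier. A minimal sketch of that idea, assuming the weights come from some validation-based quality measure (the paper's actual measures and combining scheme are more elaborate):

```python
# Minimal sketch: combine per-classifier anomaly scores with
# normalized quality weights. The weights and threshold below are
# hypothetical, not the paper's actual performance measures.

def weighted_score_ensemble(scores, weights):
    """Weighted mean of base one-class classifiers' scores."""
    total = sum(weights)
    if total == 0:
        raise ValueError("all weights are zero")
    return sum(w * s for w, s in zip(weights, scores)) / total

# Three base classifiers score a test instance; the second classifier
# is trusted most (hypothetical weights from a validation measure).
combined = weighted_score_ensemble([0.2, 0.9, 0.4], [1.0, 3.0, 1.0])
is_inlier = combined >= 0.5  # decision threshold is an assumption
```

The point of the weighting is that a badly calibrated base classifier contributes little to the combined score rather than dominating a plain average.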

Cited by 29 publications (24 citation statements)
References 32 publications
“…Using the well known OCSVM method with default values for parameters σ and ν we distinguish real from false alarms with the use of a recursive k-means clustering method. This is very different from all previous methods that required pre-selection of parameters with the use of cross-validation or other methods that ensemble of One class classifiers [9].…”
Section: Introduction
confidence: 81%
“…Other related work further indicated that it was impossible to determine if meta-learning methods can outperform other methods such as weighting as the best method for the problem at hand depends largely upon the dataset being used and the prior knowledge to be gained [59,61]. Moreover, among the weighting methods, mean, median and majority voting are more commonly used and tend to perform better more often than others according to Menahem et al [62]. Therefore in this research, we select the weighting method of weighted majority vote because of its impressive performance and the employment of small facial image datasets (<1000 instances) in our application domain.…”
Section: Facial Emotion Recognition Using an Adaptive Ensemble Classifier
confidence: 99%
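
The citation above selects weighted majority voting as its combining rule. A minimal sketch of that rule, with hypothetical labels and weights for illustration:

```python
# Weighted majority vote: each classifier's vote counts with its
# weight; the label with the largest total weight wins. The labels
# and weights here are illustrative only.

def weighted_majority_vote(labels, weights):
    """Return the label with the largest total weight."""
    tally = {}
    for label, w in zip(labels, weights):
        tally[label] = tally.get(label, 0.0) + w
    return max(tally, key=tally.get)

# Three classifiers vote; the dissenting one carries the most weight
# individually but is outvoted by the combined weight of the other two.
winner = weighted_majority_vote(["happy", "sad", "happy"], [0.5, 0.9, 0.7])
```

Here the two "happy" votes total 1.2 and beat the single "sad" vote's 0.9, so the ensemble predicts "happy" even though the most-trusted individual classifier disagreed.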
“…According to [13], up until 2008, this research field was relatively new and had not been thoroughly explored. In particular, in the setup of diverse ensemble members, only two combining methods were considered for one-class problems: the fix-rule [42,18] and meta-learning [29] ensembles.…”
Section: One-class Ensemble Learning
confidence: 99%
“…Lastly, ACTIDS uses the TUPSO ensemble scheme [29] to train the CDS's classifier, which is responsible for combining the Sensors' local anomaly score during the prediction phase. [29]), f(·), to P_m. Each such aggregation represents a single feature (i.e., column) in the combiner's train-set.…”
Section: Training Phase
confidence: 99%
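
The citation above describes the meta-learning step: each aggregation f(·) applied to the base classifiers' predictions P_m becomes one feature (column) in the combiner's train-set. A minimal sketch of that construction, with an assumed set of aggregations (the paper's actual aggregation functions may differ):

```python
# Illustrative TUPSO-style meta-feature construction: aggregate each
# instance's base-classifier predictions into a fixed-length feature
# row for the combiner. The four aggregations chosen here are an
# assumption for illustration.

import statistics

def meta_features(predictions):
    """Map one instance's base predictions to a combiner feature row."""
    return [
        statistics.mean(predictions),    # average score
        max(predictions),                # most anomalous base score
        min(predictions),                # least anomalous base score
        statistics.pstdev(predictions),  # disagreement among classifiers
    ]

# Base predictions for two instances -> rows of the combiner's train-set.
train_set = [meta_features(p) for p in [[0.1, 0.3, 0.2], [0.8, 0.9, 0.7]]]
```

Each row of `train_set` would then be paired with its instance's label to train the combining classifier.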