DOI: 10.1007/978-0-387-09695-7_24

A Comparative Analysis of One-class Structural Risk Minimization by Support Vector Machines and Nearest Neighbor Rule

Abstract: One-class classification is an important problem with applications in several different areas, such as outlier detection and machine monitoring. In this paper we propose a novel method for one-class classification, referred to as kernel k-NNDDSRM. This is a modification of an earlier algorithm, the kNNDDSRM, intended to let the method build more flexible descriptions through the use of the kernel trick. This modification does not affect the algorithm's main feature, which is the significant reduction in…
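The abstract describes the technique only in outline. The sketch below illustrates the general idea of a kernel k-NN data description for one-class classification under stated assumptions: an RBF kernel, an acceptance threshold set at the 95th percentile of training scores, and no prototype reduction. It is not the paper's exact kernel k-NNDDSRM procedure, whose SRM-based reduction step is not reproduced here.

```python
# Minimal sketch of one-class classification with a kernel k-NN data
# description. The RBF kernel, the 95th-percentile threshold, and the use of
# all training points as prototypes are illustrative assumptions; the paper's
# kernel k-NNDDSRM additionally reduces the stored prototypes (not shown).
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    """k(a, b) = exp(-gamma * ||a - b||^2), computed pairwise."""
    d2 = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * d2)

def kernel_distances(x, prototypes, gamma=1.0):
    """Distance in the kernel-induced feature space:
    d^2 = k(x,x) - 2 k(x,z) + k(z,z) = 2 - 2 k(x,z) for an RBF kernel."""
    k_xz = rbf_kernel(x, prototypes, gamma)
    return np.sqrt(np.maximum(2.0 - 2.0 * k_xz, 0.0))

class KernelKNNDescription:
    def __init__(self, k=3, gamma=1.0, quantile=0.95):
        self.k, self.gamma, self.quantile = k, gamma, quantile

    def fit(self, X_target):
        self.prototypes_ = np.asarray(X_target, dtype=float)
        # Score each training point against the others and set the acceptance
        # threshold so that `quantile` of the target data is accepted.
        d = kernel_distances(self.prototypes_, self.prototypes_, self.gamma)
        np.fill_diagonal(d, np.inf)
        scores = np.sort(d, axis=1)[:, : self.k].mean(axis=1)
        self.threshold_ = np.quantile(scores, self.quantile)
        return self

    def predict(self, X):
        d = kernel_distances(np.asarray(X, dtype=float), self.prototypes_, self.gamma)
        scores = np.sort(d, axis=1)[:, : self.k].mean(axis=1)
        return scores <= self.threshold_  # True = accepted as target class

# Example: train on one class only and flag an outlier.
rng = np.random.default_rng(0)
target = rng.normal(0.0, 1.0, size=(200, 2))
model = KernelKNNDescription(k=3, gamma=0.5).fit(target)
print(model.predict(np.array([[0.1, -0.2], [6.0, 6.0]])))  # expected: [ True False]
```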

Cited by 3 publications (2 citation statements)
References 11 publications
“…As illustrated in Table 4, the pairwise comparison matrix for the ten LCFs was constructed by considering expert opinion and similar previous studies [21–23,26,27]. From the matrix, the weight vector for the criteria was computed using Equation (14) to Equation (17) and is presented in Table 5. After normalization, the weights for each criterion were derived using Equation (18) and are shown in Table 6.…”
Section: Weights of LCFs and Their Subclasses
Confidence: 99%
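The equation numbers (14)–(18) refer to the citing study and are not reproduced here. The sketch below shows a standard AHP weight derivation from a pairwise comparison matrix (principal-eigenvector method with normalization and a consistency check), which is the kind of computation the quoted passage appears to describe; the 3×3 matrix is purely illustrative, not the study's 10×10 LCF matrix.

```python
# Minimal sketch of deriving criteria weights from an AHP pairwise comparison
# matrix, assuming the standard principal-eigenvector method. The quoted study
# uses its own Equations (14)-(18) and a 10x10 matrix for the LCFs.
import numpy as np

def ahp_weights(pairwise):
    """Return weights (summing to 1) from a reciprocal pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)      # index of lambda_max
    w = np.abs(eigvecs[:, principal].real)
    return w / w.sum()                       # normalization step

def consistency_ratio(pairwise, weights):
    """CR = CI / RI with CI = (lambda_max - n) / (n - 1); RI are Saaty's random indices."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    lambda_max = (A @ weights / weights).mean()
    ci = (lambda_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}[n]
    return ci / ri

# Illustrative 3-criteria reciprocal matrix (values chosen only for the example).
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(M)
print(w, consistency_ratio(M, w))  # weights sum to 1; CR well below 0.1 here
```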
“…Comparative studies have shown that the optimized selection of LSM methods largely depends on the scale, nature, and data availability of the study area [14–16].…”
Section: Introduction
Confidence: 99%