2017
DOI: 10.1016/j.infsof.2017.07.009

An anomaly detection system based on variable N-gram features and one-class SVM

Cited by 91 publications (34 citation statements)
References 22 publications
“…Among the most frequently used techniques in time series classification, rare event logistic regression, an adaptation of logistic regression for this learning scenario, is a popular choice (King et al., 2001; Theofilatos et al., 2016; Ren et al., 2016; Van Den Eeckhaut et al., 2006). However, techniques such as Kullback-Leibler divergence to discriminate between rare and normal events (Xu et al., 2016), long short-term memory neural networks (Zhang et al., 2017), rule-based classification learned with genetic algorithms (Weiss and Hirsh, 1998), multiple-instance naïve Bayes (Murray et al., 2005), Poisson processes (Dzierma and Wehrmann, 2010), support vector regression with surrogate functions (Bourinet, 2016), Bayesian networks (Cheon et al., 2009) or support vector machines (Khreich et al., 2017) have been successfully adapted for this learning scenario.…”
Section: Rare Event Detection (mentioning)
confidence: 99%
“…In most of the papers that use the term novelty to describe the abnormalities, the model is learned using a dataset that contains only one class. For instance, in Khreich et al. (2017), system call traces are classified as novel or normal. A novel instance corresponds to an unsupported or unexpected system call trace.…”
Section: Novelty Detection (mentioning)
confidence: 99%
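The cited study's exact pipeline is not reproduced in this excerpt; the following is a minimal Python sketch of the general idea (variable-length n-gram features of system call traces fed to a one-class SVM trained only on normal traces), using scikit-learn's CountVectorizer and OneClassSVM. The trace strings and parameter values are invented for illustration.

# Minimal sketch (not the authors' exact pipeline): one-class learning on
# variable-length n-gram counts of system call traces; traces far from the
# learned "normal" region are flagged as novel.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import OneClassSVM

# Hypothetical traces: each trace is a space-separated sequence of system calls.
normal_traces = [
    "open read read write close",
    "open read write close",
    "open mmap read write close",
]
new_traces = [
    "open read write close",          # resembles the training traces
    "socket connect send recv exec",  # unexpected/unsupported trace
]

# 1- to 3-grams over the call sequence (the "variable N-gram" idea).
vectorizer = CountVectorizer(ngram_range=(1, 3), token_pattern=r"\S+")
X_train = vectorizer.fit_transform(normal_traces)

# Train on the normal class only; nu bounds the expected fraction of outliers.
detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_train)

labels = detector.predict(vectorizer.transform(new_traces))  # +1 normal, -1 novel
for trace, label in zip(new_traces, labels):
    print(("novel" if label == -1 else "normal"), "->", trace)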
“…The proposed method uses the outputs of one-class classifiers. Classifiers such as OneClassSVM have proven to be efficient at outlier detection [18]. We use one-class classifiers because our method only needs to differentiate good (expected) metadata reads from abnormal ones.…”
Section: Classifiers Comparison (mentioning)
confidence: 99%
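As a hedged illustration of the point above (a one-class classifier trained only on expected observations, then used to separate them from abnormal ones), the sketch below fits scikit-learn's OneClassSVM on made-up numeric features; the cited work's actual metadata-read features are not described in this excerpt.

# Illustrative only: the two features below are invented stand-ins for
# "good" metadata reads; OneClassSVM learns their region and scores new reads.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_good = rng.normal(loc=[4.0, 10.0], scale=[0.5, 1.0], size=(200, 2))

clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_good)

X_query = np.array([[4.1, 9.8],      # close to the training distribution
                    [40.0, 250.0]])  # far from anything seen during training
print(clf.decision_function(X_query))  # > 0 inside the learned region, < 0 outside
print(clf.predict(X_query))            # +1 = expected read, -1 = abnormal read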
“…More importantly, adopting a probabilistic policy in novelty detection enables us to estimate the generative probability density function of the normal data, which can cover a wide and heterogeneous spectrum of normal samples. These advantages made novelty detection techniques very successful in many applications ranging from fraud detection [17, 18], medical diagnosis [19, 20, 21], fault detection [22, 23], to anomaly and outlier detection in sensor networks [24, 25], video surveillance [26, 27] and text mining [28, 29].…”
Section: Introduction (mentioning)
confidence: 99%
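As a rough sketch of the probabilistic view described in the statement above (estimating the generative density of the normal data and flagging low-likelihood samples as novel), the example below uses a Gaussian mixture and a percentile threshold; both are illustrative choices rather than a method from any of the cited works.

# Density-based novelty detection sketch: fit a density model to normal data,
# then treat samples whose log-likelihood falls below a threshold as novel.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
X_normal = rng.normal(size=(500, 2))  # invented "normal" data

density = GaussianMixture(n_components=3, random_state=0).fit(X_normal)

# Threshold at the 1st percentile of the training log-likelihoods.
threshold = np.percentile(density.score_samples(X_normal), 1)

X_test = np.array([[0.1, -0.2],   # typical sample
                   [8.0, 8.0]])   # far in the tail
is_novel = density.score_samples(X_test) < threshold
print(is_novel)  # expected: [False  True]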