2018
DOI: 10.1016/j.procs.2018.10.328
Anomaly Detection and Classification in Cellular Networks Using Automatic Labeling Technique for Applying Supervised Learning

Cited by 25 publications (17 citation statements)
References 11 publications
“…Hybrid approaches are often used for anomaly detection. For example, Al-Mamuna and Valimaki [20] proposed a two-stage approach to anomaly detection for quality control in cellular networks. The first stage was to create a one-class SVM model to find outliers in the dataset of key performance indicators (KPIs) from all the cells (sectors of each 2G/3G/4G/5G base station).…”
Section: Related Work
confidence: 99%
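A minimal sketch of the first-stage idea described in this statement: fitting a one-class SVM on per-cell KPI vectors to flag outlier cells. The KPI columns, the scaling step, and the nu value are illustrative assumptions, not the cited paper's actual configuration.

```python
# Sketch: one-class SVM as a first-stage outlier filter over per-cell KPIs.
# All data and parameters below are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM

# Hypothetical KPI matrix: one row per cell, one column per KPI
# (e.g. drop-call rate, throughput, handover success rate).
rng = np.random.default_rng(0)
kpis = rng.normal(size=(500, 6))

# Scale KPIs so no single indicator dominates the kernel distance.
scaled = StandardScaler().fit_transform(kpis)

# nu bounds the fraction of training points treated as outliers;
# 0.05 is an assumed value for illustration.
model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale")
labels = model.fit_predict(scaled)   # +1 = normal cell, -1 = outlier cell

outlier_cells = np.where(labels == -1)[0]
print(f"{len(outlier_cells)} cells flagged as anomalous")
```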
“…Mind map of the concepts from the literature review used for anomaly detection [10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29].…”
Section: Figure
confidence: 99%
“…It can recall information from past periods of time and can cope with lags between critical points in a time series. Within this framework, as in [15], recurrent neural networks, including LSTM networks, consist of repeated neural network modules arranged in a chain, as shown in Fig. 2.…”
Section: Long Short-term Memory
confidence: 99%
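As a rough illustration of the chained LSTM modules mentioned above, the sketch below applies a single-layer LSTM to a KPI time series; the input dimensions, hidden size, and scoring head are assumptions for illustration, not the configuration used in the quoted work.

```python
# Sketch: one LSTM layer applied step by step along a KPI sequence,
# i.e. the same module repeated in a chain over time.
import torch
import torch.nn as nn

class KpiLstm(nn.Module):
    def __init__(self, n_kpis: int = 6, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_kpis, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)   # e.g. an anomaly score per sequence

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_kpis)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])    # use the last time step's hidden state

# Example: 8 sequences of 24 hourly samples with 6 KPIs each (assumed shapes).
scores = KpiLstm()(torch.randn(8, 24, 6))
print(scores.shape)   # torch.Size([8, 1])
```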
“…Likewise, in the 4-category FAP anomaly classification we reached average accuracies above 80%, which is even better than the average results reported in [2], where more easily detectable macro BSs are also involved. Moreover, our study does not require any data on neighboring cells as in [7], nor any KPI data pre-processing as in [15], and thus has potential for run-time operation at relatively reduced complexity.…”
Section: Aggregation Decision
confidence: 99%
“…But this neural network has one major drawback, namely the vanishing gradient problem [31], [32]. To address this issue, a more complex architecture was chosen, namely a neural network with long short-term memory (LSTM) [33]. However, even the LSTM network in its standard form is not suitable, since the network is unidirectional, whereas the arguments of a function can be affected by both earlier and later operations in the program.…”
Section: What Neural Network To Choose?
confidence: 99%
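Since this statement argues that a standard unidirectional LSTM is insufficient, the usual remedy it points toward is a bidirectional LSTM. The sketch below, with assumed tensor sizes, shows how a bidirectional LSTM produces per-step features that depend on both earlier and later elements of the sequence.

```python
# Sketch: bidirectional LSTM, so each position's representation is
# influenced by both preceding and following sequence elements.
# All sizes are illustrative assumptions.
import torch
import torch.nn as nn

seq = torch.randn(4, 50, 16)            # (batch, sequence length, features)
bilstm = nn.LSTM(input_size=16, hidden_size=64,
                 batch_first=True, bidirectional=True)
out, _ = bilstm(seq)
# Forward and backward hidden states are concatenated per time step,
# so the feature dimension doubles to 2 * hidden_size.
print(out.shape)                        # torch.Size([4, 50, 128])
```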