This article presents an automatic malfunction detection framework based on a data mining approach to the analysis of network event sequences. The environment under consideration is Long Term Evolution (LTE) for Universal Mobile Telecommunications System (UMTS) with a sleeping cell caused by a random access channel failure. The sleeping cell problem refers to a cell that becomes unable to provide network service without triggering any alarm. The proposed detection framework uses N-gram analysis to identify abnormal behavior in sequences of network events. These events are collected with the Minimization of Drive Tests (MDT) functionality standardized in LTE. Further processing applies dimensionality reduction, anomaly detection with the K-Nearest Neighbors (K-NN) algorithm, cross-validation, post-processing techniques, and efficiency evaluation. The different anomaly detection approaches proposed in this paper are compared against each other using both classic data mining metrics, such as the F-score and Receiver Operating Characteristic (ROC) curves, and a newly proposed heuristic approach. The achieved results demonstrate that the suggested method can be used in modern performance monitoring systems for reliable, timely, and automatic detection of random access channel sleeping cells.
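To make the pipeline concrete, the following is a minimal, illustrative sketch of the detection chain described above (N-gram features, dimensionality reduction, K-NN-based anomaly scoring), not the authors' implementation. It assumes MDT event sequences are already available as token strings; the event names, the choice of 2-grams, PCA as the reduction step, and the thresholding rule are all hypothetical simplifications.

```python
# Minimal sketch, assuming event sequences arrive as whitespace-separated
# event-type tokens; all names and parameters below are illustrative.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

# Hypothetical MDT event sequences (one string per monitored sequence).
sequences = [
    "A3_RSRP HO_CMD HO_COMPLETE A3_RSRP HO_CMD HO_COMPLETE",
    "A3_RSRP HO_CMD HO_COMPLETE A3_RSRP HO_CMD HO_COMPLETE",
    "RACH_FAIL RACH_FAIL RLF RACH_FAIL RLF RACH_FAIL",  # sleeping-cell-like
]

# Step 1: N-gram analysis -- count 2-grams over the event alphabet.
vectorizer = CountVectorizer(analyzer="word", ngram_range=(2, 2))
X = vectorizer.fit_transform(sequences).toarray()

# Step 2: dimensionality reduction of the N-gram count vectors.
X_red = PCA(n_components=2).fit_transform(X)

# Step 3: K-NN anomaly score = mean distance to the k nearest neighbors.
k = 2
nn = NearestNeighbors(n_neighbors=k + 1).fit(X_red)  # +1 skips the self-match
dist, _ = nn.kneighbors(X_red)
scores = dist[:, 1:].mean(axis=1)

# Step 4: flag sequences whose score exceeds an (assumed) threshold.
threshold = scores.mean() + scores.std()
for seq_id, s in enumerate(scores):
    print(f"sequence {seq_id}: score={s:.3f} anomalous={s > threshold}")
```

In a realistic setting the threshold and the value of k would be selected via the cross-validation step mentioned in the abstract, and detection quality would be assessed with F-score and ROC analysis rather than a fixed cutoff.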