Over time, rising demand for data speed and quality of service from a growing number of mobile network subscribers has been a major challenge in the telecommunication industry. This challenge stems from the growing subscriber population and the continuous advancement of the mobile communication industry, which together have led to network traffic congestion. In an effort to solve this problem, telecommunication companies released the Fourth Generation Long Term Evolution (4G LTE) network and afterwards the Fifth Generation (5G) network, which were claimed to address it. Machine learning techniques, which are effective at prediction, have proven capable of extracting and processing information from subscribers' perceptions of the network. The objective of this work is to use machine learning models to predict the existence of traffic congestion in LTE networks as perceived by their users. The dataset for this study was gathered from students over a period of two months using Google Forms and thereafter analysed on the Anaconda machine learning platform. This work compares the results obtained from four machine learning techniques: k-Nearest Neighbour, Support Vector Machine, Decision Tree, and Logistic Regression. The performance of these techniques was evaluated using standard metrics to ascertain whether congestion actually exists. The results show that k-Nearest Neighbour outperforms the other techniques in predicting the existence of traffic congestion. This study therefore shows that the majority of LTE network users experience traffic congestion.
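Since k-Nearest Neighbour was the best-performing technique here, a minimal sketch of how such a classifier labels a subscriber's perception as congested or not may be useful. The feature names and toy data below are hypothetical illustrations, not the paper's actual survey dataset; the abstract does not specify the features used.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points (Euclidean distance on the feature vectors)."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical toy data: (features, label), where the features might be
# normalised survey scores such as [perceived_delay, perceived_drop_rate].
train = [
    ((0.9, 0.8), "congested"),
    ((0.8, 0.9), "congested"),
    ((0.7, 0.9), "congested"),
    ((0.2, 0.1), "clear"),
    ((0.1, 0.3), "clear"),
    ((0.3, 0.2), "clear"),
]

# A query near the "congested" cluster is voted congested by its neighbours.
print(knn_predict(train, (0.85, 0.75), k=3))  # → congested
```

In practice a library implementation such as scikit-learn's `KNeighborsClassifier` would be used instead of hand-rolled code, with accuracy, precision, recall, and F1 as the standard evaluation metrics the abstract alludes to.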