Current LTE networks are experiencing significant growth in the number of users worldwide. The use of data services for online browsing, e-learning and online meetings, together with initiatives such as smart cities, means that subscribers stay connected for long periods, saturating a number of signalling resources. One such resource is the Radio Resource Control (RRC) connected-users parameter, which is allocated to each eNodeB to limit the number of users connected simultaneously to the network. Because this parameter is allocated statically, and traffic varies with the time of day and the geographical location, some eNodeBs exhaust their RRC resources (overuse) while others leave theirs idle. Since these resources are limited, their static allocation (manual configuration) therefore leads to non-optimal utilization at the eNodeB level. The objective of this paper is to design an efficient machine learning model that takes as input key performance indicators (KPIs) such as traffic volume, RRC usage and the number of simultaneous users, recorded per eNodeB, per hour and per day, and accurately predicts the number of RRC resources each eNodeB needs, so that these resources can be allocated dynamically and traffic and financial losses to the mobile network operator avoided. To reach this target, three machine learning algorithms were studied, namely linear regression, convolutional neural networks (CNN) and long short-term memory (LSTM); a model was trained and evaluated with each. The model trained with the LSTM algorithm gave the best performance, with 97% accuracy, and was therefore implemented in the proposed solution for RRC resource allocation.
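To make the prediction task concrete, the sketch below shows one possible shape of such an LSTM regressor: a 24-hour window of per-eNodeB hourly KPIs mapped to the RRC resources needed in the next hour. This is a minimal illustration only, not the authors' implementation; the Keras API, the choice of KPIs, the window length and the layer sizes are all assumptions introduced here for clarity.

```python
# Minimal sketch (assumed architecture, not the paper's exact model):
# predict next-hour RRC resource demand from a day of hourly KPIs.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

TIMESTEPS = 24   # assumed: one day of hourly samples per training window
N_FEATURES = 3   # assumed KPIs: traffic volume, simultaneous users, RRC usage

model = keras.Sequential([
    layers.Input(shape=(TIMESTEPS, N_FEATURES)),
    layers.LSTM(64),                 # learns temporal traffic patterns
    layers.Dense(32, activation="relu"),
    layers.Dense(1),                 # predicted RRC resources needed
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy arrays standing in for per-eNodeB hourly KPI history and targets.
X = np.random.rand(256, TIMESTEPS, N_FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

next_hour_rrc = model.predict(X[:1])  # forecast for one eNodeB
```

In this framing, the same sliding-window pipeline could be reused for the linear regression and CNN baselines by swapping the model definition, which is presumably how the three algorithms were compared.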