In 6G communication, many state-of-the-art machine learning algorithms are expected to be deployed to enhance performance, including latency. In this thesis, we apply Buffer Status Report (BSR) prediction to the uplink scheduling process. A BSR does not include information about data arriving after its transmission, so the base station does not allocate resources for newly arrived data, which increases latency. To address this problem, we make BSR predictions at the base station and allocate more resources than the BSRs indicate. Making an accurate prediction is difficult because many features influence the BSRs. Another challenge is that the time intervals are extremely short (on the order of milliseconds). In other traffic prediction tasks, long-term traffic data, such as weekly or monthly records, can be used to capture periodicity and trends, and external features, such as the weather, can further boost prediction results. However, at such short time scales, these features are hard to leverage. The datasets provided by Ericsson are collected from real networks. After cleaning the data, we convert the time series forecasting problem into a supervised learning problem. State-of-the-art algorithms such as Random Forest (RF), XGBoost, and Long Short-Term Memory (LSTM) are leveraged to predict the data arrival rate, and K-fold cross-validation is used to validate the models. The results show that even though the time intervals are small, the data arrival rate can be predicted, and the downlink data, downlink quality indicator, and rank indicator can boost the forecasting performance.
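The conversion from time series forecasting to supervised learning mentioned above can be sketched with a sliding window: each sample's features are the last few observed values and its target is the next value. This is a minimal illustration, not the thesis's actual pipeline; the lag count and the sample arrival rates are hypothetical.

```python
import numpy as np

def make_supervised(series, n_lags=3):
    """Turn a univariate time series into a supervised dataset:
    each row of X holds n_lags consecutive past values, and the
    corresponding entry of y is the value that follows them."""
    X, y = [], []
    for i in range(len(series) - n_lags):
        X.append(series[i:i + n_lags])  # lagged window as features
        y.append(series[i + n_lags])    # next value as target
    return np.array(X), np.array(y)

# Hypothetical data arrival rates sampled at millisecond intervals.
rates = np.array([5, 7, 6, 8, 9, 7, 10, 12])
X, y = make_supervised(rates, n_lags=3)
# X[0] is [5, 7, 6] and y[0] is 8: predict the next rate from the
# previous three observations.
```

The resulting (X, y) pairs can then be fed to any tabular learner such as RF or XGBoost, while an LSTM would consume the windows as short sequences instead.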