CPU utilization prediction is a key factor for efficient resource management and capacity planning in cloud computing environments. By accurately predicting utilization patterns, resource managers can dynamically distribute workloads to ensure optimal use of resources. Load can be spread evenly among virtual machines, reducing VM migrations and their overhead, which significantly improves the overall performance of the cloud. This proactive approach enables efficient resource usage, minimizing the risk of bottlenecks and maximizing overall system performance. In this paper, a Gradient Boosting model with grid-search-based hyperparameter tuning (GBHT) is proposed to enhance CPU utilization prediction. The proposed model combines multiple weak learners to produce a powerful prediction model, and hyperparameter tuning is used to improve both its performance and predictive accuracy. Several machine learning and deep learning models are evaluated side by side. The results clearly demonstrate that the proposed GBHT model delivers superior performance compared to traditional machine learning models (SVM, KNN, Random Forest, Gradient Boosting), deep learning models (LSTM, RNN, CNN), the time series model Facebook Prophet, as well as hybrid models combining LSTM with Gradient Boosting and Gradient Boosting with SVM. The proposed model outperforms all the others, achieving the lowest MAPE of 0.01% and high accuracy with an R2 score of 1.00.
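
To make the GBHT idea concrete, the following is a minimal sketch of grid-search hyperparameter tuning for a gradient boosting regressor using scikit-learn's GradientBoostingRegressor and GridSearchCV. The parameter grid, the lag-window feature construction, and the synthetic CPU-utilization series are illustrative assumptions, not the authors' actual configuration or data.

    # Minimal sketch of GBHT: gradient boosting with grid-search
    # hyperparameter tuning. Data and parameter grid are illustrative.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.metrics import mean_absolute_percentage_error, r2_score

    # Hypothetical data: predict the next CPU reading from a window of lags.
    rng = np.random.default_rng(0)
    cpu = 50 + 10 * np.sin(np.arange(2000) / 50) + rng.normal(0, 2, 2000)
    window = 10
    X = np.array([cpu[i:i + window] for i in range(len(cpu) - window)])
    y = cpu[window:]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, shuffle=False)  # preserve temporal order

    # Each candidate model combines many shallow trees (weak learners);
    # grid search selects the combination with the best cross-validated score.
    param_grid = {
        "n_estimators": [100, 300],
        "learning_rate": [0.05, 0.1],
        "max_depth": [2, 3],
    }
    search = GridSearchCV(
        GradientBoostingRegressor(random_state=0),
        param_grid,
        scoring="neg_mean_absolute_percentage_error",
        cv=3,
    )
    search.fit(X_train, y_train)

    # Report the same metrics used in the paper (MAPE and R2).
    pred = search.best_estimator_.predict(X_test)
    print("best params:", search.best_params_)
    print("MAPE:", mean_absolute_percentage_error(y_test, pred))
    print("R2:  ", r2_score(y_test, pred))

Note that shuffle=False keeps the train/test split chronological, which is the usual choice when evaluating forecasts on time-ordered utilization traces.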