“…So we developed code that iterates over various regression models to forecast a temperature series using the obtained coefficients B1 and B2. These models include Decision Tree [52], Bagging [53], AdaBoost [54], XGBoost [55], SVR (Support Vector Regression) [56], Gradient Boosting [57], Linear Regression [58], Random Forest [59], Ridge [60], LassoLars [61], RANSAC [62], SVR Poly [63], Elastic Net CV [64], OMP (Orthogonal Matching Pursuit) [65], Tweedie [66], Gaussian Process [67], Passive Aggressive [68], CatBoost [69], LightGBM (Light Gradient Boosting Machine) [70], Hist Gradient Boosting (Histogram-based Gradient Boosting) [71], Bayesian Ridge [72], PA Hinge (Passive Aggressive with Hinge Loss) [73], Extra Trees [74], Theil-Sen [75], Poisson [76], GLM (Generalized Linear Model) [77], Quantile Regression [78], and Gamma [79]. For each model, the code trains a polynomial regression of varying degree and evaluates performance metrics such as mean squared error (MSE), R-squared (R2), root mean squared error (RMSE), normalized MSE (NMSE), mean absolute error (MAE), and mean percentage error (MPE).…”
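
The loop described above can be sketched as follows. This is a minimal illustration, not the authors' actual code: it uses a small subset of the listed models (scikit-learn's DecisionTreeRegressor, LinearRegression, Ridge, and SVR), a synthetic temperature series in place of the real data, and assumed definitions for NMSE (MSE divided by the variance of the target) and MPE (mean of the relative errors, in percent).

```python
# Sketch of iterating over candidate regressors with polynomial features of
# several degrees and collecting error metrics for a temperature series.
# The model subset, synthetic data, and NMSE/MPE definitions are assumptions.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.svm import SVR
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error, r2_score, mean_absolute_error

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200).reshape(-1, 1)                 # time index
temp = 20.0 + 5.0 * np.sin(t).ravel() + rng.normal(0, 0.5, 200)  # synthetic series

models = {
    "Decision Tree": DecisionTreeRegressor(max_depth=4, random_state=0),
    "Linear Regression": LinearRegression(),
    "Ridge": Ridge(alpha=1.0),
    "SVR": SVR(kernel="rbf", C=10.0),
}

results = []
for name, model in models.items():
    for degree in (1, 2, 3):
        # Polynomial regression of the given degree feeding each estimator.
        pipe = make_pipeline(PolynomialFeatures(degree), model)
        pipe.fit(t, temp)
        pred = pipe.predict(t)

        mse = mean_squared_error(temp, pred)
        rmse = np.sqrt(mse)
        nmse = mse / np.var(temp)                 # normalized MSE (assumed definition)
        mae = mean_absolute_error(temp, pred)
        mpe = np.mean((temp - pred) / temp) * 100  # mean percentage error (assumed)
        r2 = r2_score(temp, pred)

        results.append((name, degree, mse, r2, rmse, nmse, mae, mpe))

# Rank candidates by MSE to pick the best model/degree combination.
best = min(results, key=lambda r: r[2])
```

In practice each of the 28 listed estimators would be added to the `models` dictionary, and the metrics would be computed on a held-out test split rather than on the training data as done here for brevity.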