2023
DOI: 10.54097/hbem.v5i.5100

Walmart Sales Prediction Based on Decision Tree, Random Forest, and K Neighbors Regressor

Abstract: Sales forecasting is an important research direction in both business and academia, and many forecasting methods have been developed, including time series models, machine learning models, and deep neural network models. This paper applies three machine learning models, Decision Tree Regressor, Random Forest Regressor, and K Neighbors Regressor, to predict the Walmart Recruiting - Store Sales data. Correlation, mean absolute error, and mean squared error are used to evaluate the prediction results of these three models.
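To make the comparison concrete, here is a minimal scikit-learn sketch of the kind of experiment the abstract describes. Synthetic data stands in for the Walmart features, and the split ratio and hyperparameters are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the abstract's three-model comparison.
# Synthetic data is a stand-in for the Walmart Recruiting - Store Sales
# features; preprocessing details are not specified in the abstract.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

X, y = make_regression(n_samples=1000, n_features=8, noise=10.0,
                       random_state=42)  # stand-in for the real features
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)  # split ratio is an assumption

models = {
    "Decision Tree": DecisionTreeRegressor(random_state=42),
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=42),
    "K Neighbors": KNeighborsRegressor(n_neighbors=5),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    # The three evaluation criteria named in the abstract:
    corr = np.corrcoef(y_test, pred)[0, 1]  # Pearson correlation
    mae = mean_absolute_error(y_test, pred)
    mse = mean_squared_error(y_test, pred)
    print(f"{name}: corr={corr:.3f}  MAE={mae:.2f}  MSE={mse:.2f}")
```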

Cited by 7 publications (2 citation statements) · References 9 publications
“…For the regression approach, the performance of the following regressors was compared: Linear Regression [10], Ridge Regression [11], Bagging Regressor [12], Random Forest Regressor [13], Gradient Boosting Regressor [14], XGBoost Regressor [15], AdaBoost Regressor [16], and KNeighbors Regressor [17]. Concerning classification, the performance of the following classifiers was compared: Logistic Regression (LogReg) [18], Decision Tree (DT) [19], Random Forest Classifier (RF) [20], XGBoost classifier (XGB) [21], Multi-Layer Perceptron classifier (MLP) [13], Bagging (BC) [22], AdaBoost (ABC) [23], Gradient Boosting (GB) [24], Support Vector (SVC) [25], and Gaussian Naïve Bayes (GNB) [26].…”
Section: Machine Learning Algorithms · Citation type: mentioning (confidence: 99%)
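The broad model-comparison pattern this citing work describes is straightforward to reproduce. Below is a hedged sketch that cross-validates several of the listed regressors; XGBoost is omitted because it lives in a separate package, and all hyperparameters are library defaults rather than the cited paper's settings.

```python
# Sketch of the citing work's regressor-comparison pattern using
# scikit-learn's cross_val_score on stand-in synthetic data.
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.ensemble import (BaggingRegressor, RandomForestRegressor,
                              GradientBoostingRegressor, AdaBoostRegressor)
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=5.0,
                       random_state=0)  # stand-in data, not the paper's

regressors = {
    "Linear Regression": LinearRegression(),
    "Ridge Regression": Ridge(),
    "Bagging": BaggingRegressor(random_state=0),
    "Random Forest": RandomForestRegressor(random_state=0),
    "Gradient Boosting": GradientBoostingRegressor(random_state=0),
    "AdaBoost": AdaBoostRegressor(random_state=0),
    "KNeighbors": KNeighborsRegressor(),
}

for name, reg in regressors.items():
    # 5-fold cross-validated MAE (sklearn reports this score negated)
    scores = -cross_val_score(reg, X, y, cv=5,
                              scoring="neg_mean_absolute_error")
    print(f"{name}: MAE = {scores.mean():.2f} ± {scores.std():.2f}")
```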