2022
DOI: 10.1155/2022/1858300

XGBoost-Based E-Commerce Customer Loss Prediction

Abstract: In recent years, with the rapid development of the mobile Internet, more and more industries have begun to adopt mobile Internet technology, provide diversified wireless services, and further expand user activity scenarios. The core of reducing customer loss is identifying potential churners. To accurately predict customer loss, this paper puts forward a prediction method and validates and compares the model on the customer data of an e-commerce enterprise in China. Accord…
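The abstract describes a churn-prediction model built on e-commerce customer data. A minimal sketch of that kind of pipeline is shown below on synthetic data; the feature names are hypothetical, and scikit-learn's GradientBoostingClassifier stands in for XGBoost so the sketch is self-contained:

```python
# Sketch of a churn-prediction setup (hypothetical features, synthetic labels).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.integers(0, 365, n),     # days_since_last_order (hypothetical)
    rng.poisson(5, n),           # orders_last_90d (hypothetical)
    rng.exponential(50.0, n),    # avg_order_value (hypothetical)
])
# Synthetic churn label: long inactivity and few recent orders raise churn odds.
logits = 0.01 * X[:, 0] - 0.4 * X[:, 1] - 2.0
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
```

This is only an illustration of the workflow, not the paper's actual feature set or model configuration.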

Cited by 7 publications (5 citation statements)
References 24 publications
“…Digital industry research is only a branch field in China, and there is little research on the actual progress of development indicators. CII will be used in this study, but the progress of ECI is slow [15].…”
Section: Introduction (mentioning)
confidence: 99%
“…To establish a predictive model, the eleven most common machine learning models were trained using the final selected clinical factors. These machine learning models include SVM (13), KNN (14), RandomForest (15), ExtraTrees (16), XGBoost (17), LightGBM (18), NaiveBayes (19), AdaBoost (20), GradientBoosting (21), LR (22), and MLP (23).…”
Section: Methods (mentioning)
confidence: 99%
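The citing study benchmarks many classifiers on the same feature set. A sketch of that comparison loop, using a subset of the listed models on synthetic data (the data and cross-validation settings are assumptions, not the study's):

```python
# Benchmark several classifiers with cross-validation on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
models = {
    "LR": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "RandomForest": RandomForestClassifier(random_state=0),
    "ExtraTrees": ExtraTreesClassifier(random_state=0),
}
# Mean 5-fold cross-validated accuracy per model.
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
```

Extending the dictionary to all eleven models (XGBoost, LightGBM, etc.) follows the same pattern.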
“…where n₀ = ∑ I(xᵢ ∈ o), …, and …. (B) The XGBoost algorithm can generate a second-order Taylor expansion of the utilized loss function and obtain the optimal solution for the regular term outside the loss function (50, 51). The larger the weight of a feature and the more times it is selected by the boosted tree, the more important the feature is considered to be (52, 53).…”
Section: Methods (mentioning)
confidence: 99%
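The second-order Taylor expansion mentioned above can be made concrete with a small NumPy sketch. For logistic loss, the per-example gradient is g = p − y and the Hessian is h = p(1 − p); with an L2 penalty λ on leaf weights, the second-order objective for a single leaf is minimized at w* = −∑g / (∑h + λ). The synthetic labels and margins here are illustrative assumptions:

```python
# Second-order (Taylor) view of one XGBoost leaf for logistic loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 100).astype(float)   # labels of examples in the leaf
margin = rng.normal(0.0, 1.0, 100)          # current raw predictions
p = sigmoid(margin)
g = p - y                                   # first derivative of log loss
h = p * (1.0 - p)                           # second derivative (Hessian)
lam = 1.0                                   # L2 regularisation term
w_star = -g.sum() / (h.sum() + lam)         # closed-form optimal leaf weight

def obj(w):
    # Second-order approximation of the regularised loss at leaf weight w.
    return (g * w + 0.5 * h * w * w).sum() + 0.5 * lam * w * w
```

Because the objective is a convex quadratic in w, w* is its exact minimizer, which is why XGBoost can solve for leaf weights in closed form at each split.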