2018
DOI: 10.1016/j.ejor.2018.02.009

A new hybrid classification algorithm for customer churn prediction based on logistic regression and decision trees

Cited by 408 publications (204 citation statements)
References 61 publications
“…Previous work [3,4] found that RF performs well in handling datasets with large class imbalance. However, the earlier studies used a large amount of training data, which improves the performance of RF to a certain extent.…”
Section: Discussion
confidence: 99%
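The excerpt above notes that random forests handle large class imbalance well, especially with ample training data. A minimal, hypothetical illustration of that point, using a synthetic imbalanced dataset and `class_weight="balanced"` (the dataset and parameters are assumptions, not taken from the cited studies):

```python
# Sketch: random forest on an imbalanced synthetic dataset.
# class_weight="balanced" reweights classes inversely to their frequency.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

# 90/10 class split to mimic a churn-style imbalance (assumed, illustrative)
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                            random_state=0)
rf.fit(X_tr, y_tr)

# Balanced accuracy averages per-class recall, so it is not inflated
# by the majority class the way plain accuracy is.
print(balanced_accuracy_score(y_te, rf.predict(X_te)))
```

Balanced accuracy is used here deliberately: on a 90/10 split, plain accuracy would reward a model that ignores the minority (churn) class entirely.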
“…The results indicated that the RF and GB perform very well and can cope with the class imbalance comparatively well. Caigny et al [4] proposed logit leaf model (LLM) to better classify imbalanced data. LLM was benchmarked against decision trees, LR, RF and logistic model tree (LMT) with regard to the predictive performance using fourteen churn datasets.…”
Section: Introduction
confidence: 99%
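The logit leaf model mentioned in the excerpt above combines the two methods in the paper's title: a decision tree first segments the customers, then a separate logistic regression is fitted within each leaf. The sketch below is a hypothetical, simplified rendering of that idea (class name, depth, and leaf-size settings are assumptions, not the authors' implementation):

```python
# Sketch of a logit-leaf-style hybrid: shallow decision tree for
# segmentation, one logistic regression per leaf segment.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

class LogitLeafSketch:
    def __init__(self, max_depth=2, min_samples_leaf=50):
        self.tree = DecisionTreeClassifier(max_depth=max_depth,
                                           min_samples_leaf=min_samples_leaf)
        self.leaf_models = {}  # leaf id -> (model or None, constant class)

    def fit(self, X, y):
        self.tree.fit(X, y)
        leaves = self.tree.apply(X)  # leaf index for every training sample
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            if len(np.unique(y[mask])) > 1:
                # Mixed leaf: fit a local logistic regression
                model = LogisticRegression(max_iter=1000).fit(X[mask], y[mask])
                self.leaf_models[leaf] = (model, None)
            else:
                # Pure leaf: just remember its single class
                self.leaf_models[leaf] = (None, int(y[mask][0]))
        return self

    def predict(self, X):
        leaves = self.tree.apply(X)
        preds = np.empty(len(X), dtype=int)
        for i, leaf in enumerate(leaves):
            model, const = self.leaf_models[leaf]
            preds[i] = const if model is None else int(model.predict(X[i:i + 1])[0])
        return preds

# Imbalanced synthetic data standing in for a churn dataset (assumed)
X, y = make_classification(n_samples=500, weights=[0.8, 0.2], random_state=0)
clf = LogitLeafSketch().fit(X, y)
print((clf.predict(X) == y).mean())
```

The intuition is that segmentation lets each local logistic regression capture effects that differ across customer segments, while keeping the per-segment models interpretable.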
“…Huang and Kechadi (2013) presented a hybrid model-based learning system, which integrated the supervised and unsupervised techniques for predicting customer behaviour. A hybrid classification algorithm for customer churn prediction was proposed on the basis of logistic regression and decision trees (De Caigny, Coussement, & De Bock, 2018). Pendharkar (2009) applied two genetic algorithm-based NN models in the churn prediction context.…”
Section: Customer Churn Prediction
confidence: 99%
“…Recently, Umayaparvathi et al [22] wanted to find out the best among these CCP techniques by using decision tree, SVM and neural networks. SVM technique was found to be the best among them.…”
Section: Related Work
confidence: 99%
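The last excerpt describes comparing decision trees, SVMs, and neural networks as churn predictors. A hedged sketch of such a comparison via cross-validated AUC on a synthetic imbalanced dataset (the data, model settings, and metric choice are illustrative assumptions, not the cited study's protocol):

```python
# Sketch: cross-validated AUC comparison of three model families
# often benchmarked for customer churn prediction.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# 85/15 imbalance to mimic a churn setting (assumed)
X, y = make_classification(n_samples=600, weights=[0.85, 0.15], random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=0),
    # SVM and MLP are scale-sensitive, so standardize inside a pipeline
    "SVM": make_pipeline(StandardScaler(), SVC(random_state=0)),
    "neural net": make_pipeline(StandardScaler(),
                                MLPClassifier(hidden_layer_sizes=(16,),
                                              max_iter=500, random_state=0)),
}

# ROC AUC is threshold-free, a common choice for imbalanced churn data
scores = {name: cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
          for name, m in models.items()}
for name, auc in scores.items():
    print(f"{name}: AUC = {auc:.3f}")
```

Which family wins depends heavily on the dataset, which is why studies like the one quoted benchmark all three side by side rather than assuming a ranking.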