2018
DOI: 10.1007/s10618-018-0576-8

Linear regression for uplift modeling

Abstract: The purpose of statistical modeling is to select targets for some action, such as a medical treatment or a marketing campaign. Unfortunately, classical machine learning algorithms are not well suited to this task since they predict the results after the action, and not its causal impact. The answer to this problem is uplift modeling, which, in addition to the usual training set containing objects on which the action was taken, uses an additional control group of objects not subjected to it. The predicted true …
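The abstract describes the core uplift setup: a treatment group on which the action was taken and a control group that was not, with the goal of scoring the action's causal impact rather than the raw post-action outcome. A common baseline in this literature, and the natural starting point for linear regression, is the separate-model (two-model) approach sketched below. This is a minimal illustration assuming scikit-learn and synthetic data; it is not the paper's specific estimator or experimental setup.

```python
# Minimal sketch of the separate-model (two-model) approach to uplift
# regression: fit one linear model on the treatment group, one on the
# control group, and predict uplift as the difference of their outputs.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic example data (an assumption, for illustration only).
n, p = 1000, 5
X = rng.normal(size=(n, p))
treated = rng.integers(0, 2, size=n).astype(bool)   # True = action taken
# Outcome: a baseline effect of X plus an extra effect for treated objects.
y = X @ rng.normal(size=p) + treated * (0.5 + X[:, 0]) + rng.normal(scale=0.1, size=n)

# Fit separate models on the treatment and control subsets.
model_t = LinearRegression().fit(X[treated], y[treated])
model_c = LinearRegression().fit(X[~treated], y[~treated])

# Predicted uplift = expected outcome if treated minus expected outcome if not.
uplift = model_t.predict(X) - model_c.predict(X)

# Objects with the largest predicted uplift are the natural targets.
top_targets = np.argsort(uplift)[::-1][:10]
```

Targeting then amounts to ranking objects by the predicted uplift and acting only on the top of the list.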

Cited by 25 publications (17 citation statements)
References 20 publications
“…Second, profit is more difficult to model since this outcome is only observable in a few cases but more closely related to the main objective than website visit, purchase, or revenue. The proposed new approaches in this paper extend findings from the field of binary and revenue uplift modeling (e.g., Surry 1999, 2011; Kane et al 2014; Rudaś and Jaroszewicz 2018; Gubela et al 2020) and from the field of two-stage estimation via sample selection (see, e.g., Heckman 1979) and zero-inflated regression (see, e.g., Lambert 1992; Ridout et al 2001) as well as one-stage parameter estimation via ordinary regression and random forest. We show that the new approaches are well suited to select "best" customers as targets for direct marketing campaigns and improve profit.…”
Section: Introduction (supporting)
confidence: 57%
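The statement above combines two ideas that lend themselves to a small sketch: uplift modeling with a treatment and a control group, and two-stage estimation for a profit-like outcome that is only observed when a purchase occurs. The hypothetical code below fits, within each group, a first-stage classifier for the purchase probability and a second-stage regressor for revenue among purchasers, then scores revenue uplift as the difference in expected revenue. All function and variable names are illustrative assumptions; this is a generic hurdle-style construction, not the cited papers' procedures (in particular it is neither Heckman's sample-selection correction nor a zero-inflated likelihood).

```python
# Illustrative two-stage, two-model sketch for revenue/profit uplift:
# stage 1 models P(purchase | x), stage 2 models E[revenue | purchase, x],
# fitted separately for the treatment and control groups.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def fit_two_stage(X, purchased, revenue):
    """Return (classifier, regressor) fitted on one group."""
    clf = LogisticRegression(max_iter=1000).fit(X, purchased)
    reg = LinearRegression().fit(X[purchased], revenue[purchased])
    return clf, reg

def expected_revenue(models, X):
    clf, reg = models
    # E[revenue] = P(purchase) * E[revenue | purchase]
    return clf.predict_proba(X)[:, 1] * reg.predict(X)

def revenue_uplift(X, treated, purchased, revenue, X_new):
    """Score new customers by the predicted change in expected revenue."""
    models_t = fit_two_stage(X[treated], purchased[treated], revenue[treated])
    models_c = fit_two_stage(X[~treated], purchased[~treated], revenue[~treated])
    return expected_revenue(models_t, X_new) - expected_revenue(models_c, X_new)
```

Campaign targets would then be the customers with the highest predicted revenue uplift, net of contact cost.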
“…Given that none of the tree-based techniques has been employed for modeling revenue uplift in e-commerce, the evaluation also broadens the scope of empirical results for causal machine learning methods.
[Overview of causal machine learning methods in related work:]
Causal Forest
Cai et al (2011): Two-Step Estimation Procedure
Chickering and Heckerman (2000): Uplift Tree with Post-Processing Procedure
Diemert et al (2018): x -
Guelman et al (2015a): Causal Conditional Inference Tree/Forest
Guelman et al (2015b): Uplift Random Forests
Gutierrez and Gérardy (2017): - x
Hahn et al (2019): Causal Bayesian Regression Trees
Hansen and Bowers (2008): x -
Hansotia and Rukstales (2002a): Incremental Response Tree
Hansotia and Rukstales (2002b): Uplift Tree with the ∆∆ splitting criterion
Causal BART
Imai and Ratkovic (2013): Uplift Support Vector Machine
Jaroszewicz and Rzepakowski (2014): Uplift k-Nearest Neighbors
Kane et al (2014): x -
Kuusisto et al (2014): Uplift Support Vector Machine
Künzel et al (2019): X-Learner
Lai et al (2006): x -
Lechner (2019): Modified Causal Forests
Lo (2002): x -
Lo and Pachamanova (2015): Multiple Treatments Logistic Regression
Nassif et al (2013): x
Oprescu et al (2018): Orthogonal Causal Random Forest
Powers et al (2018): Causal boosting
Radcliffe and Surry (1999): Uplift Trees
Radcliffe and Surry (2011): - x
Rzepakowski and Jaroszewicz (2012a): Multiple Treatments Uplift Trees
Rzepakowski and Jaroszewicz (2012b): Information Theory-Based Uplift Trees
Rudaś and Jaroszewicz (2018): x -
Shaar et al (2016): Pessimistic Uplift
Shalit et al (2017): Causal Artificial Neural Network
Sołtys et al (2015): Uplift Ensemble Methods
Su et al (2012): Uplift k-Nearest Neighbors
Taddy et al (2016): Causal Bayesian Forests
Tian et al (2014): x -
Yamane et al (2018): Separate Label Uplift Modeling
This study…”
Section: Background and Related Work (mentioning)
confidence: 99%
“…Each model learns the likelihood of a positive outcome, rather than the what-if difference in behavior (Radcliffe and Surry 2011). Nonetheless, Rudaś and Jaroszewicz (2018) demonstrate that the SMA performs competitively for uplift regression when the sample size is sufficiently large and highly correlated variables are removed.…”
Section: Data Preprocessing Approach: Multitreatment Modified Outcome (mentioning)
confidence: 93%
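The last statement attributes the competitiveness of the separate-model approach (SMA) in uplift regression to two conditions: a sufficiently large sample and the removal of highly correlated variables. The sketch below shows one simple, assumed way to apply that advice, a greedy correlation filter followed by the usual two linear models; the 0.9 threshold and the greedy dropping rule are illustrative choices, not recommendations from the paper.

```python
# Drop highly correlated features with a simple greedy filter, then fit the
# separate-model approach (SMA) for uplift regression.
import numpy as np
from sklearn.linear_model import LinearRegression

def drop_correlated_columns(X, threshold=0.9):
    """Greedily keep columns whose absolute pairwise correlation with every
    already-kept column stays below the threshold."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return keep

def fit_sma(X, y, treated, threshold=0.9):
    """Fit one linear model per group on the reduced feature set."""
    cols = drop_correlated_columns(X, threshold)
    Xr = X[:, cols]
    model_t = LinearRegression().fit(Xr[treated], y[treated])
    model_c = LinearRegression().fit(Xr[~treated], y[~treated])
    # Return the kept columns so new data can be reduced the same way.
    return cols, model_t, model_c

def predict_uplift(cols, model_t, model_c, X_new):
    Xr = X_new[:, cols]
    return model_t.predict(Xr) - model_c.predict(Xr)
```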