To do or not to do? Cost-sensitive causal classification with individual treatment effect estimates
2023
DOI: 10.1016/j.ejor.2022.03.049

Cited by 14 publications (6 citation statements)
References 32 publications (50 reference statements)
“…Compared with the SMOTE, which randomly selects the center and generates new samples in the vicinity, the IFCM-SMOTE-SVM can generate more realistic and reliable samples. Therefore, the new kernel space-based SVM is proposed by combining the IFCM-SMOTE-SVM with the KS-SMOTE-SVM, named KS-IFCM-SMOTE-SVM, and bringing Eq (26) into Eq (29):…”
Section: PLOS ONE (mentioning, confidence: 99%)
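The statement above contrasts the proposed KS-IFCM-SMOTE-SVM with standard SMOTE, which picks a random minority "center" and generates new samples in its vicinity. As background only, here is a minimal NumPy sketch of that standard SMOTE interpolation step; it does not reproduce the IFCM- or kernel-space variants (Eqs (26) and (29) of the citing paper are not shown here), and the function name and parameters are hypothetical.

```python
import numpy as np

def smote_like_oversample(X_min, n_new, k=5, seed=None):
    """Generate n_new synthetic minority samples by interpolating between a
    randomly chosen minority point (the "center") and one of its k nearest
    minority-class neighbours, as in standard SMOTE. Illustrative only."""
    rng = np.random.default_rng(seed)
    n, d = X_min.shape
    k = min(k, n - 1)
    # Pairwise Euclidean distances within the minority class
    dist = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nn = np.argsort(dist, axis=1)[:, :k]      # k nearest neighbours per point
    synth = np.empty((n_new, d))
    for j in range(n_new):
        i = rng.integers(n)                   # random "center" sample
        q = nn[i, rng.integers(k)]            # one of its nearest neighbours
        lam = rng.random()                    # interpolation factor in [0, 1]
        synth[j] = X_min[i] + lam * (X_min[q] - X_min[i])
    return synth

# Example: generate 40 synthetic samples from a toy 20-point minority class
X_min = np.random.default_rng(0).normal(size=(20, 4))
X_new = smote_like_oversample(X_min, n_new=40, k=5, seed=1)
```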
“…By analyzing the intrinsic relationships existing between samples through non-random sampling, the original feature information is maintained in the sampling process, making the classification model less prone to overfitting and misclassification during the training process [24, 25]. Cost-sensitive learning reduces the error of SVM in classifying minority classes by reducing the overall cost of misclassification and improves the classification of unbalanced data [26]. Vanderschueren T [27] improved the general learning model into the cost-sensitive learning model by calculating the ideal cost for each sample and modifying the original sample class to obtain the new sample set.…”
Section: Introduction (mentioning, confidence: 99%)
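The cost-sensitive learning idea referenced in the statement above can be approximated, at class level, with per-class weights in scikit-learn's SVC. This is a generic sketch of class-level misclassification costs, not the instance-level "ideal cost" relabelling scheme attributed to Vanderschueren et al. [27]; the weight values and toy data are arbitrary illustrative choices.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Imbalanced toy data: roughly 5% positives
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Class-level misclassification costs: errors on the minority class (1) are
# penalised 10x more than errors on the majority class (0). The factor 10 is
# an illustrative assumption; class_weight="balanced" would instead weight
# classes inversely to their frequency.
clf = SVC(kernel="rbf", class_weight={0: 1.0, 1: 10.0})
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```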
“…Due to the importance of the CATE (conditional average treatment effect) problem, there are many other publications devoted to this problem [28-31].…”
Section: Related Work (mentioning, confidence: 99%)
“…Next, the work of Verbeke et al. [21] analysed the impact of following prognostic recommendations. They designed a decision support model.…”
Section: Introduction (mentioning, confidence: 99%)