2017
DOI: 10.1007/978-3-319-57454-7_17
Cost Matters: A New Example-Dependent Cost-Sensitive Logistic Regression Model

Cited by 4 publications (5 citation statements)
References 11 publications
“…To overcome the issue that traditional cost-sensitive business failure prediction is implemented under the assumption that the misclassification cost matrix is known and fixed, [47] proposed a heterogeneous ensemble selection framework and combined it with a multi-objective optimization algorithm for cost-sensitive business failure prediction, allowing it to adapt to circumstances in which the misclassification cost is uncertain. [48] established an example-dependent cost-sensitive logistic regression algorithm to reduce the misclassification error on minority-class samples, saving a further 10% of misclassification cost on a vehicle dataset from a European manufacturer. [49] assigned a different cost to each example and proposed an example-dependent cost-sensitive decision tree for finance-related fields such as credit scoring, fraud detection, and direct marketing, which maximized the cost savings.…”
Section: B. Imbalanced Business Failure Prediction (mentioning)
confidence: 99%
“…The MLE principle ensures that the difference between the predicted value and the true value is smallest; that is, when y_i = 1, p_i is largest, and when y_i = 0, 1 − p_i is largest. The likelihood function for MLE is derived as follows (Günnemann & Pfeffer, 2017).…”
Section: Logistic Regression (mentioning)
confidence: 99%
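The quoted passage breaks off before the formula itself; as a point of reference, a minimal sketch of the standard Bernoulli likelihood and log-likelihood that MLE maximizes for logistic regression (a textbook reconstruction, not text quoted from the citing paper or from Günnemann & Pfeffer) is:

% Likelihood and log-likelihood for logistic regression,
% with p_i = \sigma(\beta^\top x_i) = 1 / (1 + e^{-\beta^\top x_i})
L(\beta) = \prod_{i=1}^{N} p_i^{\,y_i} \, (1 - p_i)^{1 - y_i}
\ell(\beta) = \log L(\beta) = \sum_{i=1}^{N} \left[ y_i \log p_i + (1 - y_i) \log (1 - p_i) \right]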
“…As classification errors have different costs, different weights are assigned to the different classification outcomes, as shown in Table 2. The weight for classifying a positive example as positive is c_TP, the weight for classifying a positive example as negative is c_FN, the weight for classifying a negative example as positive is c_FP, and the weight for classifying a negative example as negative is c_TN (Günnemann & Pfeffer, 2017).…”
Section: Cost Matrix (mentioning)
confidence: 99%
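To make the role of the per-example cost entries concrete, here is a minimal Python sketch of an example-dependent cost-sensitive logistic regression objective, following the commonly used Bahnsen-style expected-cost formulation; the function name, the synthetic data, and the assumption that this matches the exact model of Günnemann & Pfeffer (2017) are illustrative rather than taken from the paper.

import numpy as np

def sigmoid(z):
    # Logistic function mapping scores to probabilities in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def example_dependent_cost(beta, X, y, c_tp, c_fn, c_fp, c_tn):
    # Mean expected cost of a logistic model under per-example costs.
    # Each c_* argument is a vector with one entry per example, so the
    # penalty for, say, a false negative can differ from row to row
    # instead of coming from a single fixed cost matrix.
    p = sigmoid(X @ beta)  # predicted probability of the positive class
    # Expected cost per example: each cell of the cost matrix is weighted
    # by the probability the model assigns to that outcome.
    cost = (y * (p * c_tp + (1.0 - p) * c_fn)
            + (1.0 - y) * (p * c_fp + (1.0 - p) * c_tn))
    return cost.mean()

# Tiny illustrative run with synthetic data and made-up costs.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + rng.normal(scale=0.5, size=100) > 0).astype(float)
c_tp = np.zeros(100)
c_tn = np.zeros(100)
c_fp = np.full(100, 1.0)             # uniform cost for false alarms
c_fn = rng.uniform(2.0, 10.0, 100)   # per-example cost for missed positives
print(example_dependent_cost(np.zeros(3), X, y, c_tp, c_fn, c_fp, c_tn))

Minimizing this objective with a gradient-based optimizer in place of the usual negative log-likelihood is the basic idea behind example-dependent cost-sensitive logistic regression; correct classifications are typically given zero or small costs, as in the c_tp and c_tn vectors above.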