2022
DOI: 10.1007/s00180-022-01301-9
Fair evaluation of classifier predictive performance based on binary confusion matrix

Abstract: Evaluating the ability of a classifier to make predictions on unseen data and increasing it by tweaking the learning algorithm are two of the main reasons motivating the evaluation of classifier predictive performance. In this study the behavior of Balanced $$AC_1$$ — a novel classifier accuracy measure — is investigated under different class imbalance condition…

Cited by 11 publications (3 citation statements); references 34 publications.
“…A confusion matrix is used to evaluate the model's performance in a classification problem. Finding three crucial factors, namely accuracy, sensitivity, and specificity, is done by using the confusion matrix's elements [25].…”
Section: Performance Measures
confidence: 99%
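The three measures named in the statement above follow directly from the four cells of a binary confusion matrix. A minimal sketch, with purely illustrative counts (not taken from the paper):

```python
def confusion_metrics(tp, fn, fp, tn):
    """Return (accuracy, sensitivity, specificity) for a binary confusion matrix."""
    total = tp + fn + fp + tn
    accuracy = (tp + tn) / total   # fraction of all predictions that are correct
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

# Illustrative counts only:
acc, sens, spec = confusion_metrics(tp=40, fn=10, fp=5, tn=45)
print(acc, sens, spec)  # 0.85 0.8 0.9
```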
“…"Positive" indicates when the ramp event is forecasted to occur, and "Negative" refers to when the ramp event is forecasted to not occur. By combining T or F and P or N according to whether the forecasting and the actual measurement match, the forecasting result is represented as a total of four cases: TP, FN, FP, and TN [28].…”
Section: Performance Validation Metric For Ramp Event Detection 1) Co…
confidence: 99%
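The four-way labelling described in that statement can be sketched as a small function; the name `outcome` is a hypothetical helper for illustration, not from the cited paper:

```python
def outcome(forecast_event: bool, actual_event: bool) -> str:
    """Label one forecast/measurement pair as TP, FN, FP, or TN."""
    if forecast_event and actual_event:
        return "TP"   # event forecast and it occurred
    if not forecast_event and actual_event:
        return "FN"   # event occurred but was not forecast (missed event)
    if forecast_event and not actual_event:
        return "FP"   # event forecast but did not occur (false alarm)
    return "TN"       # correctly forecast that no event occurs

print(outcome(True, True), outcome(False, True))  # TP FN
```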
“…The prediction results from the training and testing process are shown in the following figure: Accuracy, sensitivity (recall), precision, and F1-score are some of the commonly used predictive performance measures based on the binary confusion matrix [21]. Each measure relates to a particular aspect of performance, so the measure appropriate for the problem at hand is generally chosen according to the performance aspect to be investigated.…”
Section: Evaluation of Results
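The four measures named in that statement can likewise be computed from the confusion-matrix counts. A minimal sketch with illustrative counts (the cited paper's data is not reproduced here):

```python
def performance_measures(tp, fn, fp, tn):
    """Return (accuracy, recall, precision, f1) from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fn + fp + tn)
    recall = tp / (tp + fn)        # a.k.a. sensitivity
    precision = tp / (tp + fp)
    # F1 is the harmonic mean of precision and recall
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, recall, precision, f1

# Illustrative counts only:
acc, rec, prec, f1 = performance_measures(tp=30, fn=20, fp=10, tn=40)
```

Each measure emphasizes a different aspect of performance: recall penalizes missed positives, precision penalizes false alarms, and F1 balances the two.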