2021
DOI: 10.19101/ijatee.2021.874615
A two-phase feature selection technique using mutual information and XGB-RFE for credit card fraud detection

Abstract: De Sá et al. [5] developed a customized classification algorithm that automatically generates a Bayesian network classifier to manage the class imbalance. Even though effective methods such as data-level, algorithm-level, hybrid, and cost-sensitive learning are proposed by researchers to normalise the imbalanced…
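As a minimal sketch of one of the imbalance-handling strategies the abstract lists (cost-sensitive learning), the snippet below uses scikit-learn's `class_weight="balanced"` option; the synthetic data and the 95:5 imbalance ratio are illustrative assumptions, not values from the paper.

```python
# Cost-sensitive learning sketch: weight the rare (fraud-like) class
# more heavily instead of resampling the data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced data: ~95% majority class, ~5% minority class.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y,
                                          random_state=0)

# class_weight="balanced" scales each class's misclassification cost
# in inverse proportion to its frequency in the training set.
clf = LogisticRegression(class_weight="balanced", max_iter=1000)
clf.fit(X_tr, y_tr)
minority_recall = recall_score(y_te, clf.predict(X_te))
print(round(minority_recall, 2))
```

The same idea carries over to boosted trees (e.g. a `scale_pos_weight`-style parameter) when a gradient-boosting model is used instead.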

Cited by 14 publications (3 citation statements)
References 32 publications
“…The outcome of the proposed model exhibits a promising f1-score of 91%. The experimental evaluation results of AE-XGB were compared with other related methods such as AdaBoost+XGB [48], Deep-Q NR [49], and XGBoost [50] on the same dataset. The comparison shows that AE-XGB with 𝜃=0.3 achieved a good precision of 91% and a high recall of 90%, with the highest f1-score of 91%.…”
Section: Discussion
confidence: 99%
“…Priscilla et al. [11] proposed a two-stage feature selection method using mutual information in the first stage and recursive feature elimination (RFE) in the second stage to eliminate redundant features. Pashaei et al. [12] used minimum redundancy maximum relevance (mRMR) as a first-level filter and then introduced simulated annealing and crossover operators into a binary arithmetic optimization algorithm to select the minimum set of informative genes.…”
Section: Related Work
confidence: 99%
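The two-stage scheme described in this citation statement (a mutual-information filter followed by RFE) can be sketched with scikit-learn as below; the feature counts (20 → 10 → 5) and the gradient-boosting estimator standing in for XGBoost are illustrative assumptions, not values from the cited paper.

```python
# Two-stage feature selection sketch:
#   Stage 1 - filter features by mutual information with the label.
#   Stage 2 - recursive feature elimination with a boosted-tree model.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import RFE, SelectKBest, mutual_info_classif

X, y = make_classification(n_samples=300, n_features=20,
                           n_informative=5, random_state=0)

# Stage 1: keep the 10 features with the highest mutual information.
stage1 = SelectKBest(mutual_info_classif, k=10).fit(X, y)
X_mi = stage1.transform(X)

# Stage 2: recursively drop the weakest features until 5 remain,
# ranking them by the tree ensemble's feature importances.
stage2 = RFE(GradientBoostingClassifier(random_state=0),
             n_features_to_select=5).fit(X_mi, y)
X_final = stage2.transform(X_mi)

print(X_final.shape)  # (300, 5)
```

The cheap filter in stage 1 shrinks the search space so the more expensive wrapper in stage 2 only refits the model over a small candidate set.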
“…The identification of irrelevant or redundant features is a challenging task. Feature selection methods are used to choose a relevant subset of parameters that is sufficient to represent the original data [24]. A model with a reduced feature set will also have better time complexity.…”
Section: Literature Review
confidence: 99%