2018 International Conference on Information and Communications Technology (ICOIACT)
DOI: 10.1109/icoiact.2018.8350710
Taxpayer compliance classification using C4.5, SVM, KNN, Naive Bayes and MLP

Cited by 22 publications (11 citation statements); references 11 publications.
“…In [Jupri and Sarno 2018], five classification algorithms are compared in the context of tax compliance: Decision Tree C4.5, Support Vector Machine (SVM), KNN, Naive Bayes, and Multilayer Perceptron (MLP). The best result was obtained by the C4.5 decision tree, with 98.93% accuracy, while the MLP ranked last with 89.26% accuracy.…”
Section: Avaliação dos Resultados (Results Evaluation)
“…Supervised: [Jupri and Sarno 2018], [Xiangyu et al 2018], [Zhu et al 2018], [Wu et al 2019], [Rahimikia et al 2017], [Goumagias et al 2018], [Kleanthous and Chatzis 2019] (total: 8). Unsupervised: [de Roux et al 2018], [Wei et al 2019], [Mehta et al 2019], [Vanhoeyveld et al 2020], [Mathews et al 2018], [Mehta et al 2018] (total: 6).…”
Section: Técnica / Estudo / Total (Technique / Studies / Total)
“…The split between training data and testing data uses the K-Fold Cross Validation method: the data are divided into k folds and the classification process is executed k times; in each round, one fold is selected as the testing data and the remaining folds serve as the training data [16]. The testing fold for each round is selected in sequence, starting from fold 1; see Figure 3 for an illustration.…”
Section: Comparing Results
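The fold-selection procedure quoted above can be sketched in plain Python. This is a minimal illustration under stated assumptions, not the cited authors' implementation: the function name `k_fold_splits` and the 10-sample example are invented for this sketch.

```python
from typing import List, Tuple

def k_fold_splits(n_samples: int, k: int) -> List[Tuple[List[int], List[int]]]:
    """Divide sample indices into k folds. In each of the k rounds,
    one fold (chosen in sequence, starting from fold 1) is the testing
    data and the remaining folds form the training data."""
    indices = list(range(n_samples))
    fold_size = n_samples // k
    folds = [indices[i * fold_size:(i + 1) * fold_size] for i in range(k)]
    # put any remainder samples into the last fold
    folds[-1].extend(indices[k * fold_size:])
    splits = []
    for test_fold in range(k):
        test_idx = folds[test_fold]
        train_idx = [i for f in range(k) if f != test_fold for i in folds[f]]
        splits.append((train_idx, test_idx))
    return splits

# 10 samples, 5 folds: each round holds out one 2-sample fold for testing
for round_no, (train, test) in enumerate(k_fold_splits(10, 5), start=1):
    print(f"round {round_no}: test fold = {test}")
```

Every sample appears in the testing data exactly once across the k rounds, which is what makes the per-fold accuracies comparable.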
“…For the KNN classifier, the recognition result is decided by a majority vote of the k nearest neighbours, as described in [26]. For the Naïve Bayes classifier, decision-making is based on Bayes' theorem with the assumption of independent input variables, as implemented in [27]. In this experiment, we assume that the prior probabilities of these gestures are the same.…”
Section: Results
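The KNN majority-vote rule described in [26] can be sketched as follows. This is a hypothetical illustration, not code from the cited work: the helper `knn_predict`, the feature tuples, and the class labels "A"/"B" are assumptions, and the Naive Bayes step is not shown.

```python
from collections import Counter
import math

def knn_predict(train, query, k):
    """Classify `query` by a majority vote of its k nearest neighbours.
    `train` is a list of ((feature, ...), label) pairs."""
    # sort training samples by Euclidean distance to the query point
    by_distance = sorted(train, key=lambda item: math.dist(item[0], query))
    # count the labels of the k closest samples and take the majority
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

samples = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
           ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]
print(knn_predict(samples, (0.2, 0.1), k=3))  # "A": 2 of the 3 nearest are "A"
```

With equal class priors, as the excerpt assumes for the Naive Bayes classifier, the posterior comparison reduces to comparing class-conditional likelihoods.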