2018 IEEE Symposium on Computer Applications & Industrial Electronics (ISCAIE)
DOI: 10.1109/iscaie.2018.8405448
Rule pruning techniques in the ant-miner classification algorithm and its variants: A review

Abstract: Rule-based classification is considered an important task of data classification. The ant-miner rule-based classification algorithm, inspired by the ant colony optimization algorithm, shows performance comparable to existing methods in the literature and outperforms them in some application domains. One problem that often arises in any rule-based classification is overfitting. Rule pruning is a framework to avoid overfitting. Furthermore, we find that the influence of rule pruning in ant-miner …

Cited by 21 publications (13 citation statements)
References 39 publications
“…The proposed algorithm relies only on the pheromone, which means that no heuristic information is required to construct the clustering solution. To construct a route of attributes, a centroid-construction procedure highly similar to that of Ant-Miner for the classification problem was employed (Al-Behadili et al., 2019; 2018b). Given an example data set in which the number of attributes D is three (3), each centroid contains three attribute values. The number of clusters K involves three (3) centroids, in which each single cluster has one single centroid.…”
Section: Methods (mentioning)
Confidence: 99%
“…|S| is the number of instances in dataset S, and |S_v| is the number of instances in S_v, the subset of S partitioned by attribute A that has value v. Information gain, which measures the homogeneity difference of the data before and after a node is split by attribute A, is given by equation (2)…”
Section: Decision Tree (mentioning)
Confidence: 99%
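As a concrete illustration of the quantities in that statement, the following is a minimal information-gain computation in the textbook ID3 style (generic symbols S, S_v, A; this is a sketch, not code from the cited paper):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels: -sum p * log2(p)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain(S, A) = Entropy(S) - sum over values v of |S_v|/|S| * Entropy(S_v)."""
    n = len(labels)
    # Partition the labels by the value of the chosen attribute (S -> S_v).
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in partitions.values())
    return entropy(labels) - remainder

# Toy example: attribute 1 ("windy") perfectly separates the classes.
rows = [("sunny", True), ("sunny", False), ("rain", True), ("rain", False)]
labels = ["no", "yes", "no", "yes"]
print(information_gain(rows, labels, 1))  # → 1.0
```

A decision-tree learner would evaluate this gain for every candidate attribute and split on the one with the highest value.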
“…Training error can be reduced by increasing model complexity, but this can increase testing error, meaning the model cannot predict new data properly. This condition is called overfitting, and it can occur due to noise or the lack of a representative sample in the training dataset [2].…”
Section: Introduction (mentioning)
Confidence: 99%
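The trade-off described in that statement can be demonstrated with a small sketch (an illustration on assumed synthetic data, not the cited paper's experiment): fitting polynomials of growing degree to noisy samples drives the training error toward zero while the fit to the underlying function degrades.

```python
import numpy as np

rng = np.random.default_rng(0)
# Ten noisy training samples from a simple underlying function.
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
# Held-out points from the noise-free function.
x_test = np.linspace(0.05, 0.95, 50)
y_test = np.sin(2 * np.pi * x_test)

def mse(y, y_hat):
    """Mean squared error between targets and predictions."""
    return float(np.mean((y - y_hat) ** 2))

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = mse(y_train, np.polyval(coeffs, x_train))
    test_err = mse(y_test, np.polyval(coeffs, x_test))
    print(f"degree={degree}: train MSE={train_err:.4f}, test MSE={test_err:.4f}")
```

At degree 9 the polynomial interpolates all ten noisy points, so the training error collapses to (near) zero; the model has memorized the noise rather than the signal, which is exactly the overfitting the citation describes.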
“…This cooperative behavior and other unique features make ACO suitable for building new algorithms. Many NP-complete problems, such as the travelling salesman problem (Brezina & Čičková, 2011), fault tolerance (Bukhari, Ku-Mahamud, & Morino, 2017), sequential ordering (Skinderowicz, 2015), grid scheduling (Ku-Mahamud, Din, & Nasir, 2011), and data classification (Al-Behadili, Ku-Mahamud, & Sagban, 2018), have been solved using ACO algorithms. ACO has also been applied to solve routing problems in WSNs because it is suitable for static, mobile, and dynamic WSN environments.…”
Section: ACO Work in Wireless Sensor Network (mentioning)
Confidence: 99%